
The idea is promising: a no-code platform for orchestrating LLM workflows, with integrations, HTTP requests, and conditional logic. The UI is clean and intuitive. Deployment options like API access and web app generation are convenient, and the self-hosted version gives good flexibility for internal use cases. Review collected by and hosted on G2.com.
- There’s no way to pass hidden input variables to a workflow at chat start; they’re always visible to the user, which breaks many personalization and routing use cases.
- Variable size limits in the Cloud version are unreasonably low. Even small JSON objects exceed the allowed length, making it unusable for real AI workflows.
- No support for basic operations like “in” or “includes” to check whether a value exists in a list. You’re forced into awkward workarounds or have to abandon the logic entirely.
- Support is non-responsive. Even on a paid plan with “priority email support,” we received only generic replies like “read the docs” or “we don’t provide implementation help.” This isn’t acceptable at $59/month.
- Most of our interactions with support (Jerome Yuan and Xinyang Zhang) ended with “this is not supported,” even for very basic questions. No workaround, no roadmap.
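For context, the list-membership check mentioned above is a single built-in operation in most expression languages. A minimal Python sketch of the pattern the platform reportedly lacks (all names here are hypothetical and not the platform’s syntax):

```python
# Hypothetical routing example: check whether a variable's value
# appears in a list, then branch on the result.
allowed_regions = ["us-east", "eu-west", "ap-south"]
user_region = "eu-west"

# In Python this is one built-in "in" expression:
if user_region in allowed_regions:
    route = "regional-workflow"
else:
    route = "fallback-workflow"
```

Without such an operator, the same check requires chaining equality comparisons or an external HTTP call, which is the kind of workaround the review describes.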

