What do you like best about Valohai?
The platform is very straightforward and easy to use, and the UI is accessible to a wide range of users regardless of their technical expertise. It's easy to get started, and learning its intricacies doesn't take long at all. Just write some YAML, store some environment variables, connect to your repo, and go to town on your projects.
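To give a sense of what that YAML looks like, here's a rough sketch of a step definition in a `valohai.yaml`; the step name, image, script, and parameter values are hypothetical placeholders, not from my actual project, so check the official docs for the exact schema:

```yaml
# Hypothetical example of a Valohai step definition
- step:
    name: train-model            # assumed step name
    image: python:3.9            # any Docker image your code runs in
    command: python train.py {parameters}
    parameters:
      - name: learning_rate
        type: float
        default: 0.001
```

Once a step like this is committed to the repo, anyone on the team can launch it from the UI without touching the code.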
In terms of collaboration features, it's not lacking: as a team, we can work in shared workspaces, meaning everyone involved in a project can access and work on the same experiments. Because it integrates with Git, it also provides version control and traceability. It's incredibly easy to share setups with other team members, as anyone can review, debug, or replicate previously set-up tasks or pipelines. This also enables a collaborative workflow between data scientists and data engineers, where we can contribute to different stages of the project at the same time, which streamlines the development process.
It has an efficient hyperparameter tuning setup, making it a useful tool for fine-tuning. No matter your framework of choice, whether you're team PyTorch or team TensorFlow, the support for multiple frameworks ensures you don't have to make significant changes to your tech stack.
When you define the parameters for your tuning run, it immediately shows how many combinations your parameters result in, which is really handy: it lets users stay conscious of the number of runs and the costs associated with them. In cases where you need to do heavy grid searches, the auto-scaling queue handles all the runs, which is one less thing you have to worry about.
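The combination count it shows is just the product of the sizes of each parameter's value list. A quick sketch of that arithmetic (the parameter names and values here are made up for illustration):

```python
from itertools import product

# Hypothetical parameter grid, similar to what you'd define for a tuning task
grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [32, 64],
    "epochs": [10, 20],
}

# Every grid-search run is one element of the Cartesian product
combinations = list(product(*grid.values()))
print(len(combinations))  # 3 * 2 * 2 = 12 runs
```

Seeing that number up front before launching makes it easy to trim a grid that would otherwise explode into hundreds of runs.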
The team behind Valohai is incredibly lovely, and the customer support is knowledgeable, friendly, and responsive. I really like that they encourage us to get in touch with them directly whenever we run into issues. They're great at troubleshooting the problems we encounter and are quick to offer solutions that work. Review collected by and hosted on G2.com.