Would be useful to have a comparison to Google's AutoML Tables: https://cloud.google.com/automl-tables/docs/features
Pretty neat, but unfortunately I can't see many business cases for this. I haven't worked with a ton of models, but unless you're dealing with largely solved problems like classification, the results won't be great.
First of all, which models are going to be used? How many combinations of hyperparameters are going to be tried? A combinatorial explosion seems inevitable.
And if you don't know how to prepare the right dataset, it's all in vain.
Not really a critique of AWS, but of AutoML in general.
EDIT: After a deeper read, it seems it's regression on textual data only.
Wait, really? I just upload tables of inputs and the expected outputs, and it tries various models for me?
Do any other services do this?
I assume this won't do things like add convolutional layers if you give it pixel or signal data, right?
Like, is this just adding standard layers to a neural net, maybe trying a few activation functions, fiddling with the number of layers, and seeing which gives the best results?
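In spirit, the simplest version of what these services do is a search over model families and hyperparameters, scored by cross-validation. Here's a minimal sketch with scikit-learn; the candidate models and grids are my own illustration, not what SageMaker Autopilot actually does under the hood:

```python
# Naive AutoML-style search: try several model families and hyperparameter
# grids, keep whichever scores best on cross-validation.
# Illustrative only -- real AutoML systems also search preprocessing and
# feature engineering, and use smarter strategies than plain grid search.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

candidates = [
    (LogisticRegression(max_iter=5000), {"C": [0.01, 0.1, 1.0, 10.0]}),
    (RandomForestClassifier(), {"n_estimators": [100, 300],
                                "max_depth": [None, 5, 10]}),
]

best_score, best_model = -1.0, None
for estimator, grid in candidates:
    search = GridSearchCV(estimator, grid, cv=5, scoring="accuracy")
    search.fit(X, y)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(f"best model: {best_model}\ncv accuracy: {best_score:.3f}")
```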
In general, if you're interested in looking into the AutoML landscape and its adoption, here's a Kaggle kernel based on the recent Kaggle Survey: https://www.kaggle.com/nulldata/carving-out-the-automl-niche...
Do any of these AutoML offerings have a way to use the generated model in JavaScript/Node.js? I know of [sklearn-porter](https://github.com/nok/sklearn-porter), which transpiles scikit-learn models to JavaScript among other targets, but I'm not sure if it connects nicely with any of the solutions discussed here.
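For reference, basic sklearn-porter usage looks roughly like this (from my reading of its README; treat the exact API as an assumption, as it may differ between versions):

```python
# Rough sketch of exporting a scikit-learn model to JavaScript with
# sklearn-porter. Based on the project's README; illustrative, not verified
# against any particular version.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn_porter import Porter

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier().fit(X, y)

porter = Porter(clf, language="js")         # other targets like 'java' or 'c' are listed in the README
js_source = porter.export(embed_data=True)  # standalone JS with the trained tree baked in

with open("model.js", "w") as f:
    f.write(js_source)
```

Whether that helps depends on the AutoML service: anything that only serves predictions behind an API call can't be exported this way, which is why downloadable models matter for this use case.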
I wonder how this compares on features and price to similar products like H2O.ai's Driverless AI, DataRobot, Big Squid, etc.
This is a rather interesting development. Just last week I saw a similar feature in IBM Watson being demoed on IBM Cloud, and now AWS SageMaker has this capability.
Does this mean that, going forward, the demand for data scientists and ML developers at small-to-mid-size IT companies and corporates will decrease?
How does the algorithm analyze the results and look for overfitting?
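The announcement doesn't spell this out, but the standard approach is to score each candidate model on data it wasn't trained on and flag a large train/validation gap. A generic sketch of that check (not Autopilot's actual internals):

```python
# Generic overfitting check: compare the training score to a held-out
# validation score. A large gap suggests the model memorized the training
# data. This is the standard technique, not Autopilot's documented behavior.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2,
                                                  random_state=0)

model = RandomForestClassifier().fit(X_train, y_train)
train_acc = model.score(X_train, y_train)
val_acc = model.score(X_val, y_val)

print(f"train accuracy: {train_acc:.3f}, validation accuracy: {val_acc:.3f}")
if train_acc - val_acc > 0.05:   # threshold is arbitrary, for illustration
    print("large train/validation gap -- likely overfitting")
```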
This is the second science-fiction-level announcement from Amazon in as many days. Either they're about to take over the world with effective AGI and Quantum Computation, or they're being a bit silly.
Google's AutoML produces black-box models that are only available over a network call. This service seems to produce downloadable models, plus a notebook with the Python code that creates the model. If that's the case, this is substantially better than GCP's offering.
AWS consistently releases similar products after GCP... but they are much better thought out, as AWS has to support them indefinitely...