
Stop Writing ML Scripts! AutoML in Azure Trained This Model for Me

10 min read

In this guide we’ll walk through a YouTube-ready flow for Azure Machine Learning:

  1. A quick mental model of how training works in Azure ML
  2. What a notebook actually is (in simple terms)
  3. How to train a no-code AutoML classification model
  4. How to deploy and test a real-time endpoint
  5. How this compares to the official command job tutorial from Microsoft docs
  6. How to fix the annoying AxiosError: Request failed with status code 400 when uploading your credit card CSV in Azure ML Studio

You can use this both as a blog post and as the script outline for a YouTube video.


1. The scenario: Credit card default prediction

We’ll work with a credit card default dataset (for example, a version of the “Default of Credit Card Clients” dataset).

Each row represents a client and includes:

  • Demographic features – age, education, marital status, etc.
  • Credit information – credit limit, bill amounts, previous payments, etc.
  • A target label such as default_payment_next_month indicating whether the client defaulted.

Our goal:

Train a model that predicts whether a customer is likely to default next month, then deploy it as a real-time endpoint in Azure ML.


2. How Azure ML thinks about training

Azure Machine Learning gives you multiple ways to submit training jobs:

  • Command jobs via CLI / Python SDK (you write a script, define an environment, and submit a job)
  • Automated ML (AutoML) where Azure ML tries different algorithms and hyperparameters for you
  • Pipelines that chain multiple steps like data prep, training, evaluation, and deployment

Under the hood, all of these are just jobs running on a compute target (CPU/GPU cluster, workstation, etc.), logging metrics and registering models.

The official “Train a model” tutorial focuses on a command job using a training script to build a classifier for credit card default.

In this article we’ll take an alternative route:

  • Use no-code AutoML in Azure ML Studio
  • Still train a classification model
  • Still deploy to a managed online endpoint
  • But with less boilerplate and a more visual flow

This is perfect if you want your video to be original, but still consistent with Microsoft’s official patterns.


3. What is a notebook?

You’ll likely reference notebooks at some point in your video, so here’s a simple way to explain them.

A notebook is an interactive document where you can write code in small chunks called cells, run each cell independently, and see the output right below it.

Instead of one big script, you get a step-by-step lab journal:

  • Code cells for experiments
  • Text cells for explanations
  • Charts and tables right next to the code that produced them

In Azure ML, notebooks run inside your workspace, which makes them ideal for exploring data, testing models, and documenting your ML experiments.
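
For example, a first code cell in an Azure ML notebook for our scenario might look like this (a minimal sketch, assuming the CSV has been downloaded next to the notebook and that the label column is named default_payment_next_month, as in the rest of this guide):

```python
# A typical first notebook cell: load the data and take a quick look.
import pandas as pd

df = pd.read_csv("default_of_credit_card_clients.csv")
print(df.shape)  # (rows, columns)

# Class balance of the target; in a notebook this renders right below the cell.
df["default_payment_next_month"].value_counts()
```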

4. Prerequisites

Before starting:

  • An Azure subscription

  • An Azure Machine Learning workspace

  • Access to Azure ML Studio at:

    👉 https://ml.azure.com

  • A CSV file of your credit card data – for example, a downloaded copy of:

  https://azuremlexamples.blob.core.windows.net/datasets/credit_card/default_of_credit_card_clients.csv

5. Step 1 – Open ml.azure.com and create compute

  1. In your browser, navigate to https://ml.azure.com.

  2. Sign in and select your Azure ML workspace.

  3. In the left menu, open Compute → Compute clusters.

  4. Click + New and configure:

    • Compute name: e.g. cpu-cluster-demo (we’ll reference this name again when submitting the AutoML job)
    • Virtual machine size: e.g. Standard_DS11_v2 (CPU)
    • Minimum number of nodes: 0 (so it scales down when idle)
    • Maximum number of nodes: 1 (enough for a demo)
  5. Click Create and wait for the cluster to be ready.

This compute will be used by the AutoML job later.
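
If you’d rather script this step than click through Studio, here’s a minimal sketch using the Azure ML Python SDK v2 (the subscription, resource group, and workspace values are placeholders you need to fill in):

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import AmlCompute
from azure.identity import DefaultAzureCredential

# Connect to the workspace (fill in your own IDs).
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

# Same settings as in the Studio wizard above.
cluster = AmlCompute(
    name="cpu-cluster-demo",
    size="Standard_DS11_v2",
    min_instances=0,  # scale to zero when idle
    max_instances=1,  # enough for a demo
)
ml_client.compute.begin_create_or_update(cluster).result()
```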


6. Step 2 – Upload the credit card CSV as a data asset

We’ll create a tabular data asset in Studio backed by your CSV.

  1. From https://ml.azure.com, on the left menu go to Data.

  2. Click + Create → From local files.

  3. In the wizard:

    • Data type: Tabular
    • Name: e.g. credit-card-defaults
    • Description: “Credit card clients with default label” (optional)
    • File format: CSV
    • Column headers: select File has headers
    • Delimiter: select Comma (,)
    • Encoding: select UTF-8
  4. Upload your CSV file and let Studio infer the schema.

  5. Review the detected columns and data types, then click Create.

You can mention in the video: “I’m choosing Tabular data, CSV format, telling Azure ML that the first row has headers, and that the delimiter is a comma. These options must match the actual structure of your file.”
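
For completeness, a rough SDK v2 equivalent of this wizard is sketched below. Note that it registers the CSV as a plain file asset; the Studio wizard’s Tabular option additionally infers and stores a schema, so treat this as an approximation rather than an exact replacement:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Data
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential

# Reads a config.json in the working directory; or build MLClient as shown earlier.
ml_client = MLClient.from_config(credential=DefaultAzureCredential())

data_asset = Data(
    name="credit-card-defaults",
    path="./default_of_credit_card_clients.csv",  # local file, uploaded on create
    type=AssetTypes.URI_FILE,
    description="Credit card clients with default label",
)
ml_client.data.create_or_update(data_asset)
```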


7. Fixing AxiosError: Request failed with status code 400 during CSV upload

If you see an error like:

AxiosError: Request failed with status code 400
at zt (manualChunk_data-fetch-93c92e5c.js:19:31127)
at XMLHttpRequest.D (manualChunk_data-fetch-93c92e5c.js:20:2163)

it usually means Studio rejected the upload request, often due to file format / header / encoding issues rather than a pure network problem.

7.1. Sanity-check the CSV locally

Open the file in Excel, VS Code, or a text editor and verify the points below (a quick pandas check that automates most of this follows the list):

  1. Encoding = UTF-8

    Save the file as UTF-8 CSV (no BOM if you get that option). Avoid unusual encodings.

  2. Exactly one header row

    • The first line should be the column names.
    • No extra header/title line above it.
    • No completely empty header cells (a stray ,, in the header row produces unnamed columns).
  3. Clean header names

    • Prefer simple names: LIMIT_BAL, SEX, EDUCATION, AGE, PAY_0, etc.
    • Avoid problematic characters like #, ;, or quotes in header names.
    • Spaces are usually fine, but underscores are safer.
  4. Consistent delimiter

    • Confirm you’re using commas throughout (no mixing ; and ,).
  5. File size sanity check

    • For testing, you can copy the first 5–10k rows into a new file and try uploading that.
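
Here is the pandas check mentioned above. It parses the file with exactly the options we’ll select in Studio (UTF-8, comma delimiter, one header row) and flags the most common offenders; the filename is illustrative:

```python
import pandas as pd

# Parse with the same options we'll select in the Studio wizard.
df = pd.read_csv("default_of_credit_card_clients.csv", encoding="utf-8", sep=",")

print(df.shape)             # sane row/column counts?
print(df.columns.tolist())  # exactly the headers you expect?

# Empty header cells show up as "Unnamed: N" columns.
print("unnamed columns:", [c for c in df.columns if str(c).startswith("Unnamed")])

# Fully empty rows often come from trailing blank lines.
print("empty rows:", int(df.isna().all(axis=1).sum()))
```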

7.2. Use “From local files” (not direct URL)

If you were using the dataset URL directly inside the UI, switch to this flow instead:

  1. Download the file from the URL to your local machine.
  2. In Studio (https://ml.azure.com): Data → + Create → From local files → Tabular.
  3. Set CSV / headers / comma delimiter as described above.
  4. Upload again.

If you want to ingest from cloud storage, the recommended path is:

  • Upload the CSV into Azure Blob Storage associated with your workspace.
  • Create a datastore pointing to that container.
  • Then create a data asset From datastore rather than from an external public URL.

7.3. Inspect the detailed error message (optional but helpful)

If it still fails:

  1. Press F12 in your browser to open Developer Tools.
  2. Go to the Network tab and re-try the upload.
  3. Look for the request that returns 400 Bad Request (often something like data/datasets).
  4. Click it and check the Response tab – Azure ML often returns a more descriptive message (e.g. invalid header, unsupported encoding, schema inference error).

That message will usually tell you exactly what Studio disliked.

7.4. “Repair” the CSV

A quick “reset” that fixes many problems (a scripted alternative follows the list):

  1. Open the original CSV in Excel.

  2. Ensure:

    • First row is a single header row.
    • No summary rows at bottom (like “Total”).
  3. Save As → CSV UTF-8 (Comma delimited).

  4. Re-upload this cleaned-up file via From local files at https://ml.azure.com.
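
And here is the scripted alternative to the Excel round-trip, assuming the original file parses at all (the output filename is up to you; summary rows like “Total” still need a manual look):

```python
import pandas as pd

df = pd.read_csv("default_of_credit_card_clients.csv")

# Drop fully empty rows and columns; inspect and remove any "Total"-style
# summary rows yourself before re-uploading.
df = df.dropna(how="all").dropna(axis=1, how="all")

# Write back as plain UTF-8, comma-delimited, one header row, no index column.
df.to_csv("default_of_credit_card_clients_clean.csv", index=False, encoding="utf-8")
```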

Once the upload succeeds and you see a valid tabular data asset, you’re ready for AutoML.


8. Step 3 – Create an AutoML classification job (no code)

Now we’ll train a classification model using Automated ML in Studio. This is where you explicitly choose Classification and the other key options.

  1. From https://ml.azure.com, in the left menu go to Automated ML.

  2. Click + New automated ML job.

  3. Basic settings

    • Select dataset: choose credit-card-defaults.
    • New experiment name: e.g. credit-card-defaults-automl.
    • Job name: auto-generated or customize if you like.
  4. Click Next to go to Task type & settings.

8.1. Choose Classification and key options

On the Task type & settings screen:

  1. Task type:

    • From the dropdown, select Classification.
    • Mention in voiceover: “We are predicting a yes/no default, so this is clearly a classification problem.”
  2. Target column:

    • Choose your label column, e.g. default_payment_next_month.
    • This tells Azure ML what you want to predict.
  3. Additional configuration

    • Primary metric: choose AUC weighted or Accuracy (good default metrics for binary classification).
    • Training job time: optionally set a maximum time (e.g. 15–30 minutes for a demo).
    • Max concurrent iterations: for a small cluster, 1–2 is usually fine.
    • Validation type: leave the default (often “Auto” or k-fold) unless you need something specific.

Click Next to go to the Compute step.

8.2. Select compute

On the Compute screen:

  1. Choose Existing compute cluster.
  2. Select the cluster you created earlier, e.g. cpu-cluster-demo.
  3. Review all options and click Submit.

Behind the scenes, Azure ML will:

  • Automatically try multiple algorithms and hyperparameters
  • Perform data splits / cross-validation to estimate performance
  • Log metrics, charts, and logs for each child run
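
If you later want to reproduce this run from code rather than the wizard, a hedged SDK v2 sketch looks like the following. AutoML in SDK v2 expects the training data as an MLTable asset, so the asset name and version here are placeholders, not something the Studio flow created for you:

```python
from azure.ai.ml import MLClient, Input, automl
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential

ml_client = MLClient.from_config(credential=DefaultAzureCredential())

# Mirrors the wizard: classification task, target column, primary metric, compute.
job = automl.classification(
    compute="cpu-cluster-demo",
    experiment_name="credit-card-defaults-automl",
    training_data=Input(type=AssetTypes.MLTABLE, path="azureml:credit-card-defaults-mltable:1"),
    target_column_name="default_payment_next_month",
    primary_metric="AUC_weighted",
)
job.set_limits(timeout_minutes=30, max_concurrent_trials=2)

returned_job = ml_client.jobs.create_or_update(job)
print(returned_job.studio_url)  # open the run in Studio
```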

9. Step 4 – Review the best model

After the AutoML job finishes:

  1. Open the AutoML run in Studio (Automated ML → Experiments → your experiment).
  2. You’ll see a Leaderboard of models, sorted by the primary metric you chose (e.g. AUC).
  3. Click on the Best model row to open details.

You can now inspect:

  • Overall metrics (Accuracy, AUC, F1, etc.)
  • The confusion matrix to see how well the model distinguishes default vs non-default
  • Feature importance (if available) to understand which features drive predictions (e.g. past payment history, credit limit)

You can register the best model to the workspace directly from this view (deploying it from Studio also registers it as part of the flow). That registered model is what we’ll deploy next.


10. Step 5 – Deploy the best model to a real-time endpoint

From the Best model view:

  1. Click Deploy → Real-time endpoint.

  2. Fill in deployment settings:

    • Endpoint name: e.g. credit-default-endpoint
    • Deployment name: e.g. blue
    • Select a suitable compute for the endpoint (a small CPU instance is fine for demos).
  3. Click Review + create, then Create.

Azure ML will:

  • Package the model and environment into a container
  • Host it behind a managed online endpoint
  • Expose a REST API for scoring requests

Once deployment completes:

  1. In Studio, go to Endpoints → Real-time endpoints and open your endpoint.

  2. Go to the Test tab:

    • Paste a JSON object representing a single client (same feature names as in your dataset).
    • Click Test.
  3. You should see a JSON response with:

    • Predicted class (e.g. 0 = no default, 1 = default)
    • Probabilities or scores

Azure ML also generates sample code snippets (Python, curl) to call this endpoint from your applications.
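
Those generated snippets are the source of truth for your endpoint; the sketch below only shows the general shape of such a call. The scoring URI, key, and exact JSON schema must be copied from your endpoint’s Consume tab, and the column list here is abbreviated:

```python
import json
import requests

# Copy these from the endpoint's Consume tab in Studio.
scoring_uri = "https://credit-default-endpoint.<region>.inference.ml.azure.com/score"
api_key = "<ENDPOINT_KEY>"

# One client, using the same feature names as the training data (abbreviated).
payload = {
    "input_data": {
        "columns": ["LIMIT_BAL", "SEX", "EDUCATION", "MARRIAGE", "AGE", "PAY_0"],
        "data": [[20000, 2, 2, 1, 24, 2]],
    }
}

response = requests.post(
    scoring_uri,
    headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
    data=json.dumps(payload),
)
print(response.json())  # predicted class and/or probabilities, per the scoring script
```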


11. How this differs from the official “Train a model” tutorial

Microsoft’s official “Train a model” tutorial typically uses the command job approach:

  • You write a Python training script that reads the credit card CSV and trains a classifier.
  • You define an environment with dependencies.
  • You submit a command job via CLI/SDK.
  • You register the model and then deploy it.
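
For contrast, the core of that command-job approach fits in a few SDK v2 lines like these (the script folder, environment, and data asset version are illustrative, not copied verbatim from the tutorial):

```python
from azure.ai.ml import MLClient, Input, command
from azure.identity import DefaultAzureCredential

ml_client = MLClient.from_config(credential=DefaultAzureCredential())

job = command(
    code="./src",  # folder containing your training script, e.g. main.py
    command="python main.py --data ${{inputs.data}}",
    inputs={"data": Input(path="azureml:credit-card-defaults:1")},
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster-demo",
    experiment_name="credit-card-defaults-cmd",
)
ml_client.jobs.create_or_update(job)
```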

In this article we:

  • Used no-code AutoML in Azure ML Studio, starting at https://ml.azure.com
  • Explicitly chose Classification as the task type and a target column for defaults
  • Let the service handle algorithm selection and hyperparameter tuning
  • Still registered a model in the workspace
  • Still deployed a real-time endpoint you can call via REST