2025, Nov 28 21:00
Stop argparse crashing on Jupyter kernel flags (-f ...kernel...json) in AWS SageMaker JupyterLab
Seeing 'unrecognized arguments: -f ...kernel...json' in SageMaker JupyterLab? Learn the argparse fix: use parse_args("") to run CLIs in notebooks.
Argparse often plays badly with Jupyter-based workflows. If you run a training script in AWS SageMaker JupyterLab and see your Python process exit with “unrecognized arguments”, you’ve likely hit the classic collision between argparse and the kernel’s own flags. Here is a concise walkthrough of the symptom and the exact fix that keeps your CLI intact without refactoring your training code.
What the error looks like
usage: VICReg training script [-h] [--data_path DATA_PATH] [--exp-dir EXP_DIR]
       [--log-freq-time LOG_FREQ_TIME] [--arch ARCH] [--mlp MLP] [--epochs EPOCHS]
       [--batch-size BATCH_SIZE] [--base-lr BASE_LR] [--wd WD] [--sim-coeff SIM_COEFF]
       [--std-coeff STD_COEFF] [--cov-coeff COV_COEFF] [--num-workers NUM_WORKERS]
       [--device DEVICE] [--world-size WORLD_SIZE] [--local_rank LOCAL_RANK]
       [--dist-url DIST_URL] [--rank RANK]
VICReg training script: error: unrecognized arguments: -f /home/sagemaker-user/.local/share/jupyter/runtime/kernel-299646e9-32c7-47f8-869a-72e80aa9271b.json
That “-f …kernel…json” fragment is a tell. The Jupyter kernel injects its own command-line parameters when it starts the process, and a vanilla argparse configuration treats those as invalid.
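You can confirm this from a notebook cell by printing sys.argv. A minimal sketch of what it typically contains inside a Jupyter/SageMaker kernel follows; the exact launcher and kernel paths are illustrative, not copied from a real session.

import sys

# Inside a Jupyter or SageMaker kernel, sys.argv holds the kernel launch flags,
# not the arguments you would type on a real command line. The shape is roughly:
print(sys.argv)
# ['.../ipykernel_launcher.py', '-f', '/home/sagemaker-user/.local/share/jupyter/runtime/kernel-<id>.json']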
Minimal example that triggers the issue
Below is a typical parser setup for a VICReg-style pretraining entry point. The definitions are fine on their own, but when the script runs inside a notebook or JupyterLab terminal, argparse reads the kernel arguments and fails.
import argparse
from pathlib import Path


def build_cli_spec():
    cli = argparse.ArgumentParser(
        description="Pretrain a resnet model with VICReg",
        add_help=False
    )

    # Data
    cli.add_argument(
        "--data_path",
        default="AnnualCrop/",
        type=Path,
        help="dataset path"
    )

    # Checkpoints
    cli.add_argument(
        "--exp-dir",
        type=Path,
        default="./exp",
        help="Path to the experiment folder, where all logs/checkpoints will be stored"
    )
    cli.add_argument(
        "--log-freq-time",
        type=int,
        default=60,
        help="Print logs to the stats.txt file every [log-freq-time] seconds"
    )

    return cli
Why it happens
Argparse consumes sys.argv by default. In JupyterLab (including SageMaker notebooks), the Python process starts with additional flags that point to the active kernel file, visible in the error as “-f /path/to/kernel-...json”. Since those flags are not declared in your parser, argparse treats them as invalid and aborts.
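You can reproduce the mechanism in isolation by handing a parser the same kind of argv the kernel injects. This is a minimal sketch with a hypothetical one-flag parser and an illustrative kernel path; running it exits with the same "unrecognized arguments" error shown above.

import argparse

parser = argparse.ArgumentParser(description="minimal repro")
parser.add_argument("--epochs", type=int, default=1)

# Simulate the argv a Jupyter kernel injects. Because -f is not declared,
# parse_args() prints "error: unrecognized arguments: -f ..." and exits.
parser.parse_args(["-f", "/path/to/kernel-1234.json"])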
The fix
Make argparse ignore the process-level argv by passing an empty string (or an empty list) to parse_args. That way only the defaults you declared are used, unless you pass argument tokens in explicitly. The rest of your training logic remains unchanged.
from pathlib import Path

# build_cli_spec() is the parser factory defined above.


def run(cfg):
    # training entry point (placeholder)
    pass


if __name__ == "__main__":
    cli_obj = build_cli_spec()
    # Pass an empty string so argparse ignores the kernel's argv
    # (the "-f ...kernel...json" flags) and falls back to the declared defaults.
    opts = cli_obj.parse_args("")

    if opts.exp_dir:
        Path(opts.exp_dir).mkdir(parents=True, exist_ok=True)

    run(opts)
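If you do need non-default values inside the notebook, pass argparse an explicit list of tokens instead of letting it read sys.argv. A minimal sketch using the parser above; the override values here are purely illustrative.

# Hand argparse its tokens directly, bypassing the kernel's sys.argv entirely.
cli_obj = build_cli_spec()
opts = cli_obj.parse_args("--exp-dir ./exp_notebook --log-freq-time 30".split())
print(opts.exp_dir, opts.log_freq_time)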
This mirrors the working pattern documented elsewhere and is sufficient to run the same training script inside SageMaker JupyterLab without argparse tripping over kernel flags. For a longer explanation of using argparse in notebooks, see the reference: https://medium.com/@data.scientist/ipython-trick-how-to-use-argparse-in-ipython-notebooks-a07423ab31fc.
Why you want this in your toolbelt
Self-supervised pretraining experiments often move between local scripts and notebook environments. The exact same CLI can behave differently purely because of how the process is launched. Knowing how to neutralize external argv in Jupyter means you don’t have to litter your codebase with environment checks or duplicate entry points.
Takeaways
If you see “unrecognized arguments: -f …kernel…json” while running a CLI-driven trainer in SageMaker JupyterLab, the root cause is argparse parsing Jupyter’s own kernel flags. Bypass sys.argv by calling parse_args("") at the entry point, keep your declared options intact, and carry on with training.