There are a number of quirks and gotchas to be aware of when working with nbdev from Colaboratory.

1. Make sure you are in the project directory.

  • Imports, nbdev commands, and command line code (pretty much everything except !pip install) only work when the current working directory is within the project directory. This applies to both the command line notebook and the project development notebooks, so be sure to %cd into the project directory in both before you start working. Place a %cd path_to_project_directory command in an early cell (after installs but before any imports) of all your notebooks so as not to forget (see the example cell at the end of this list)!

  • System errors can reset notebooks and return the current working directory to home (i.e. '/content'). Thus, following a system error, particularly in the command line notebook, always check that you are still in the project directory before continuing (!pwd).

  • Oh, and make sure to use %cd, not !cd, to change directory! !cd runs in a throwaway subshell, so it has no lasting effect on the notebook's working directory.
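
A minimal 'first cells' sketch (the install line and project path are placeholders; adjust to your own project):

    !pip install nbdev               # install cell first (add whatever else your project needs)
    %cd path_to_project_directory    # placeholder: replace with your actual project directory
    !pwd                             # confirm you really are in the project directory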

2. Local imports not recognised/updating as expected.

  • As per the nbdev tutorial, set up autoreload by running the following code near the top of your notebook:

    %load_ext autoreload
    %autoreload 2
  • New local imports require a library rebuild to be recognised by Colab: !nbdev_build_lib.

  • If the import is still not working as expected, check the source notebook's corresponding .py file. Is the new code there? Did you #export it from the source notebook? If so, repeat the build and then factory reset the importing notebook BEFORE rerunning the imports cell.
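
Putting this together, the importing notebook's refresh cycle might look like this (my_project.core and hello are placeholder names for your own exported package, module and function):

    !nbdev_build_lib                    # regenerate the .py files from the #export cells
    from my_project.core import hello   # placeholder import; factory reset first if it still picks up stale code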

3. Show_doc doesn't give the expected notebook cell output.

  • The nbdev show_doc(Class.method) call, used to automatically generate documentation for class methods, doesn't give the expected rich output in Colab notebook cells; it shows just a method signature. However, it works as expected when the documentation is actually built with !nbdev_build_docs, so continue to use it as per the nbdev documentation.
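
For example (Widget and render are hypothetical stand-ins for your own exported class and method):

    from nbdev.showdoc import show_doc
    from my_project.core import Widget   # hypothetical exported class

    show_doc(Widget.render)   # only a signature appears in Colab; full docs are rendered when the docs are built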

4. nbdev_build_docs fails with an error message.

  • Most often the cause is a failed library import in one of the notebooks. Make sure all imports are working (see above if not) and, if so, rebuild the project library. If the error persists and the cause is not obvious, reset the runtime in the notebook executing the !nbdev_build_docs command (a factory reset may be required) and try again.
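
A typical recovery sequence, run from the project directory, might be:

    !nbdev_build_lib    # make sure the .py files match the notebooks first
    !nbdev_build_docs   # then rebuild the docs; any traceback names the failing notebook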

5. nbdev_diff_nbs generates an output.

  • Code diffs between the notebooks and the corresponding .py scripts should be resolved before pushing to GitHub. If !nbdev_diff_nbs gives any output, you'll therefore need to run either notebook2script() or script2notebook() as follows:

  • To keep the notebook code:

    from nbdev.export import notebook2script
    notebook2script()
  • To keep the script code:

    from nbdev.sync import script2notebook
    script2notebook()
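
Either way, rerunning the diff check is a quick way to confirm everything is back in sync:

    !nbdev_diff_nbs    # no output means the notebooks and .py scripts now match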

6. nbdev_test_nbs reports failed tests.

!nbdev_test_nbs runs every notebook in the project from top to bottom and reports a failure for any failing test, any cell that raises an exception, or any notebook whose execution stalls.

  • If tests are failing, first check the error output, which should tell you which notebook the exception arose in and the offending snippet of code.

In addition:

  1. GPU code and instructions tested in a Colab CPU instance will cause a CUDA runtime error (100) when !nbdev_test_nbs is run.

  2. The code to mount Google Drive in an open, unmounted notebook will cause notebook execution to stall (and therefore tests to fail) while it waits for user input. The same piece of code also causes a 'module not found' error when nbdev_test_nbs is run as a GitHub Action during a push, because the required Google library is not installed on the GitHub runner. In this case the code is still pushed, but a red cross appears on the repo homepage to indicate a problem.

To handle 1. and 2. above, instead of changing code or test environments that are otherwise fine, the easier solution is to exclude those cells from testing. Luckily nbdev has a way of doing this: test flags. Test flags and how to use them are covered on the Notebooks page under 'Tests' here and in the nbdev documentation.
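
As a minimal sketch, assuming the standard nbdev v1 mechanism of declaring flags in settings.ini and marking cells with a matching comment (the cuda flag name and the torch snippet are just examples):

    # In settings.ini declare the flag(s), e.g.  tst_flags = cuda
    # Then start any GPU-only (or Drive-mounting) cell with that flag:

    #cuda
    import torch
    model = torch.nn.Linear(2, 2).to('cuda')   # skipped by nbdev_test_nbs unless run with --flags cuda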

7. GitHub integration issues.

  • GitHub integration works almost exactly the same from Colab notebooks as from the command line. Just make sure git hooks are set up (i.e. !nbdev_install_git_hooks has been run in the project directory) to strip notebooks of unwanted metadata (you only need to do this once) and that the local repo is configured with your GitHub username and email using the following code (again from inside the project directory):

    !git config user.email "email"
    !git config user.name "username"

    If you used clone_new_repo() to set up your local repo then these two steps should have been done automatically.

  • Nbdev uses GitHub Actions to run a short YAML workflow of command line instructions with each git push. These include checking that the notebooks can be opened and read (!nbdev_read_nbs), checking for diffs (!nbdev_diff_nbs) and running tests on all notebooks (!nbdev_test_nbs). Problems don't necessarily cause the push to fail, but they are flagged with a 'red cross' rather than a 'green tick' next to the latest commit label on the repo page. Click on the red cross to show the GitHub Actions panel and trace the problem, then go back to the project notebooks to debug.
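
For reference, a typical push from the command line notebook, assuming the remote is already set up with valid credentials, looks something like:

    !git add -A
    !git commit -m "describe your change here"   # example commit message
    !git push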

8. Documentation on GitHub Pages.

  • GitHub Pages only shows the index.ipynb page without any other notebook documentation. Make sure you have selected the 'master branch/docs folder' option under 'GitHub Pages' in the repo settings; the 'docs' folder is ignored if the 'master branch' only option is selected.

  • A notebook doesn't appear in the docs sidebar menu. The menu is built from the first # markdown cell of each notebook, where the # must be followed by a unique notebook name, e.g. # Core. Include this before any other markdown cells and the notebook will appear in the menu under this name (see the sketch after this list).

  • Otherwise, if the docs don't look as expected on GitHub Pages, revise the nbdev syntax for auto-generating documentation from notebook cells and review your code.
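
As a sketch, the first markdown cell of a notebook might look like this (the title and summary line are placeholders; the # title becomes the sidebar entry and the > line the page summary):

    # Core

    > Example one-line summary of what this notebook/module does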