
Training locally with the full-featured NAM
===========================================

The command-line trainer is the full-featured option for training models with
NAM. To start, follow the installation instructions at :ref:`installation`.

After completing this, you will be able to use the full-featured NAM trainer by
typing

.. code-block:: console

    $ nam-full

from the command line.

Training
--------

Training uses three configuration files to specify:

1. What data you're training with (``nam_full_configs/data/``),
2. What model architecture you're using (``nam_full_configs/models/``), and
3. Details of the learning algorithm (``nam_full_configs/learning/``).

To train a model of your own gear, you'll need a paired input/output signal
from it (either by reamping a pre-recorded test signal or by simultaneously
recording your DI and the effected tone). If this is your first time, you can
download the following pre-made files:

* `input.wav <https://drive.google.com/file/d/1KbaS4oXXNEuh2aCPLwKrPdf5KFOjda8G/view?usp=sharing>`_
* `output.wav <https://drive.google.com/file/d/1NrpQLBbCDHyu0RPsne4YcjIpi5-rEP6w/view?usp=sharing>`_

Next, make a file called e.g. ``data.json`` by copying
`nam_full_configs/data/single_pair.json <https://github.com/sdatkinson/neural-amp-modeler/blob/main/nam_full_configs/data/single_pair.json>`_
and editing it to point to your audio files like this:

.. code-block:: json

    "common": {
        "x_path": "C:\\path\\to\\input.wav",
        "y_path": "C:\\path\\to\\output.wav",
        "delay": 0
    }

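Before launching a run, it can save time to check that the paths in your
``data.json`` actually resolve. The snippet below is an illustrative sketch,
not part of NAM; ``check_data_config`` is a hypothetical helper name:

.. code-block:: python

    import json
    import os

    def check_data_config(path):
        """Return any audio paths from a data config that don't exist.

        Illustrative sketch (not part of NAM): reads the config as JSON
        and checks the "common" section's input/output paths.
        """
        with open(path) as fp:
            config = json.load(fp)
        common = config["common"]
        return [
            common[key]
            for key in ("x_path", "y_path")
            if not os.path.isfile(common[key])
        ]

If the returned list is non-empty, fix those paths before training.
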
.. note:: If you're providing your own audio files, then you need to provide
    the latency (in samples) between the input and output file. A positive
    number of samples means that the output lags the input by the provided
    number of samples; a negative value means that the output `precedes` the
    input (e.g. because your DAW over-compensated). If you're not sure exactly
    how much latency there is, it's usually a good idea to add a few samples
    just so that the model doesn't need to "predict the future"!

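If you don't know the latency, one way to estimate it (outside of NAM) is to
cross-correlate the two signals and take the lag that lines them up best. A
minimal pure-Python sketch, assuming ``x`` and ``y`` are mono sample sequences;
``estimate_latency`` is a hypothetical helper, not part of NAM:

.. code-block:: python

    def estimate_latency(x, y, max_lag=2000):
        """Estimate how many samples y lags x via brute-force cross-correlation.

        Illustrative sketch, not part of NAM. A positive result means the
        output (y) lags the input (x); negative means it precedes it.
        """
        best_lag, best_score = 0, float("-inf")
        for lag in range(-max_lag, max_lag + 1):
            # Correlate y[n] against x[n - lag] over the overlapping region.
            score = sum(
                y[n] * x[n - lag]
                for n in range(max(0, lag), min(len(y), len(x) + lag))
            )
            if score > best_score:
                best_lag, best_score = lag, score
        return best_lag

The sign convention matches the note above: a positive result means the output
lags the input.
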
Next, copy to e.g. ``model.json`` a file for whichever model architecture you
want to use (e.g.
`nam_full_configs/models/wavenet.json <https://github.com/sdatkinson/neural-amp-modeler/blob/main/nam_full_configs/models/wavenet.json>`_
for the standard WaveNet from the simplified trainers), and copy to e.g.
``learning.json`` the contents of
`nam_full_configs/learning/demo.json <https://github.com/sdatkinson/neural-amp-modeler/blob/main/nam_full_configs/learning/demo.json>`_
(for a quick demo run) or
`default.json <https://github.com/sdatkinson/neural-amp-modeler/blob/main/nam_full_configs/learning/default.json>`_
(for something more like a normal use case).

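The copying steps above amount to two file copies; scripted, they might look
like the sketch below (``stage_configs`` is a hypothetical helper, and the
paths assume a checkout of the repository):

.. code-block:: python

    import shutil
    from pathlib import Path

    def stage_configs(repo_root, workdir):
        """Copy template model/learning configs into a working directory.

        Illustrative sketch, not part of NAM. Assumes `repo_root` is a
        checkout of neural-amp-modeler containing `nam_full_configs/`.
        """
        repo_root, workdir = Path(repo_root), Path(workdir)
        shutil.copy(repo_root / "nam_full_configs" / "models" / "wavenet.json",
                    workdir / "model.json")
        shutil.copy(repo_root / "nam_full_configs" / "learning" / "demo.json",
                    workdir / "learning.json")

You can then edit the copies freely without touching the templates.
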
Next, to train, open up a terminal. Activate your ``nam`` environment and call
the training script with

.. code-block:: console

    $ nam-full \
        path/to/data.json \
        path/to/model.json \
        path/to/learning.json \
        path/to/outputs

where the first three input paths are where you saved the files above, and the
final output path is wherever you'd like your training results to be saved.

.. note:: NAM uses
    `PyTorch Lightning <https://lightning.ai/pages/open-source/>`_
    under the hood as a modeling framework, and you can control many of the
    PyTorch Lightning configuration options from
    ``nam_full_configs/learning/default.json``.

Once training is done, a file called ``model.nam`` is created in the output
directory. To use it, point
`the plugin <https://github.com/sdatkinson/NeuralAmpModelerPlugin>`_ at the file
and you're good to go!