DeepPavlov for developers: #2 setup and deployment

All articles in the series:



1. DeepPavlov for developers: #1 NLP tools and chatbot creation

2. DeepPavlov for developers: #2 setup and deployment



Hello! In the first article of our series, we learned what DeepPavlov is, which library models are ready to use without prior training, and how to run REST servers with them. Before we move on to training models, let's talk about the various ways of deploying DeepPavlov models and some features of the library settings.



We assume that all library commands in this article are executed in a Python environment with the DeepPavlov library installed (for installation, see the first article; for virtualenv, read here). The examples in this article do not require knowledge of Python syntax.







Modes of interaction with DeepPavlov NLP models



DeepPavlov currently supports 4 ways to interact with NLP models (both pre-trained and user-created):







The <config_path> parameter (required in all four modes) can take either the full path to the model config file or the name of the config file without an extension. In the latter case, the model config must be registered in the library.
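
To make both forms concrete, here is a rough sketch using the standard DeepPavlov CLI commands (interact for console interaction, riseapi for the REST server); ner_ontonotes is used only as an example of a registered config name, and the file path is arbitrary:

 # run a model interactively, referring to its config by registered name
 # (-d downloads the pre-trained model files on the first run)
 python -m deeppavlov interact ner_ontonotes -d

 # run a REST server for a model defined by a full path to its config file
 python -m deeppavlov riseapi /path/to/my_model_config.json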



The configs of all models supplied with DeepPavlov are registered in the library. The list of supplied models can be found in the MODELS section of the DeepPavlov documentation, and their configs can be found here.



GPU usage



In any of the above modes, NLP models based on neural networks are initialized, which makes them quite demanding on computing resources. You can improve model performance by using a GPU. To do this, you will need an NVIDIA graphics card with sufficient video memory (depending on the model you run) and a supported version of the CUDA framework. All the information you need to run DeepPavlov models on the GPU can be found here.
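
One common way to control GPU usage (not specific to DeepPavlov, and assuming CUDA is already set up) is the CUDA_VISIBLE_DEVICES environment variable, which limits the GPUs visible to the process running the model:

 # expose only the first GPU to the model process
 CUDA_VISIBLE_DEVICES=0 python -m deeppavlov riseapi <config_path>

 # hide all GPUs and force CPU execution
 CUDA_VISIBLE_DEVICES="" python -m deeppavlov riseapi <config_path>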



Library settings files



All library settings are contained in three files:





By default, the settings files are located in <deep_pavlov_root>/utils/settings, where <deep_pavlov_root> is the DeepPavlov installation directory (usually lib/python<XX>/site-packages/deeppavlov in a virtual environment). The command



 python -m deeppavlov.settings





shows the exact path to the directory with the settings files. You can also set a directory of your own by specifying its path in the DP_SETTINGS_PATH environment variable. After the first run of the above command (or of a server with any pre-trained DeepPavlov model), the settings files from the default directory will be copied to the directory specified in DP_SETTINGS_PATH. The command



 python -m deeppavlov.settings -d





resets the settings by copying the settings files from the default directory over the files in DP_SETTINGS_PATH.
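
Putting this together, a custom settings directory (the path here is arbitrary) could be set up like this:

 # point DeepPavlov to a custom settings directory
 export DP_SETTINGS_PATH=~/dp_settings

 # prints the active settings path; the settings files are copied
 # to DP_SETTINGS_PATH after the first run
 python -m deeppavlov.settings

 # restore the defaults by overwriting the files in DP_SETTINGS_PATH
 python -m deeppavlov.settings -d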



Among the DeepPavlov settings, you should pay particular attention to:





Running pre-trained models in Docker



Any pre-trained DeepPavlov model can be launched in a Docker container in REST service mode. Detailed instructions are in our repositories on DockerHub: here for the CPU, here for the GPU. The API of models running in containers fully matches the description from the previous article.
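
A hypothetical invocation is sketched below; the actual image name, environment variable, and port mapping are given in the DockerHub instructions mentioned above, so treat these values as placeholders:

 # placeholder values: check the DockerHub instructions for the exact
 # image name, env variable and port of the REST service
 docker run -e CONFIG=ner_ontonotes -p 5555:5000 deeppavlov/base-cpu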



DeepPavlov Cloud



To make it easier to work with pre-trained DeepPavlov NLP models, we have started providing them as SaaS. To use the models, you need to register with our service and obtain a token in the Tokens section of your personal account. The API documentation is in the Info section. With one token, you can send up to 1000 requests to a model.
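
As a hedged illustration of such a request (the endpoint URL and payload field names below are placeholders; the real API specification is in the Info section):

 # placeholder endpoint and payload format: consult the Info section
 # of your personal account for the actual API specification
 curl -X POST "https://cloud.deeppavlov.ai/api/<model_name>" \
      -H "Content-Type: application/json" \
      -d '{"x": ["Hello, DeepPavlov!"], "token": "<your_token>"}'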



The service is currently in alpha and is free of charge. Going forward, the set of models and the formats in which they are provided will be expanded in line with user requests. The request form can be found at the bottom of the Demo page.



The following models are now available in DeepPavlov Cloud:





Conclusion



In this article, we got acquainted with the configuration and deployment options for DeepPavlov models, learned about the DeepPavlov Docker images, and saw how to get free access to DeepPavlov models as SaaS.



In the next article, we will train a simple DeepPavlov model on our own dataset. And do not forget that DeepPavlov has a forum where you can ask your questions about the library and models. Thank you for your attention!


