An Old Guy Likes Desktops

Random posts about PC building, machine learning, and bicycles

C code + CPU is the best choice for TensorFlow inference on general-purpose Windows

This article is a translation of the Japanese version.
The original version is here*1.

 Hi, this is chang.

 Today I want to share my personal opinion about how to implement TensorFlow inference on general-purpose Windows computers. To be honest, I wanted to say "general-purpose OS," but I'm ashamed to admit I have no experience developing for Android or iOS, so today's discussion applies only to Windows.

0. Background knowledge

 It is common to use Python for deep learning development, especially in the training phase, because libraries like TensorFlow and Chainer are published for Python.

 I have explained before that you can use either Python or C code to deploy a network trained on Linux to Windows computers*2*3*4. If you can reproduce the same development environment as on Linux, Python is the easier way to write inference. Today's point is that C code is the better choice if you can spare some extra coding time.

1. Python or C code

 Programming in Python is easy compared to C. You can keep the source code short by using convenient libraries like numpy.

 This is good for researchers. In fact, I used scripting languages like Matlab when I was in university. Python was used in research, more libraries were published, and then it was used in research even more... I think the culture of "deep learning is written in Python" was established in this way.

 However, Python is not always convenient for software users, because:

(1) installation is complex and technical
(2) library compatibility is so fragile that long-term stability is difficult
(3) operation requires technical steps such as command-line commands
(4) GUI support is poor
(5) processing is not fast

 If a language is not friendly to users, it is not friendly to software suppliers either. Here are the difficulties of Python from my point of view as a software developer:

(A) Difficult to make an installer

 This is related to (1) and (2) above. If you are a developer, you can probably run pip commands and set up the environment over the internet. But for most software users, all they expect to do is double-click an installer; younger people of the smartphone era may not even know the word "installer." I think it is unrealistic to ask users to install Python from the command line, even if you simplify the installation with a requirements.txt.

 You could make an installer the way Anaconda does. In fact, I think the release of Anaconda increased the number of Python programmers. But that brings us to conflicts with other software.

(B) Conflict with other software

 On a general-purpose OS, software from many developers is installed side by side, and other software may also use Python. It is quite possible that another vendor's installer will change the version of Python or its libraries, with the result that your software stops working.

 If you develop software for a general-purpose OS, your installer has to set up all the resources your software needs to run, and those resources must not be touched by other software. The exception is the .NET Framework, but that is a platform provided by Microsoft, the supplier of the operating system. I don't think Python, an open-source project, will become a common platform like the .NET Framework, and I have no intention of using Python as the runtime environment behind my software.

(C) Difficult to create a GUI

 This is related to (3) and (4) above. With kivy and similar libraries you can build a GUI app in Python, but that approach does not scale well to complex apps. In many cases it is hard to develop the whole application in Python alone, which means you end up operating Python from C or C#. That is possible, but a little complicated, as sketched below.
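 As a rough illustration of what "operating Python from C" involves, here is a minimal sketch using the CPython embedding API. The module name my_inference and the function run are hypothetical placeholders for this sketch, and error handling is kept to the bare minimum.

/* Minimal sketch of embedding Python in a C host.
 * "my_inference" and "run" are hypothetical names used only for illustration. */
#include <Python.h>

int main(void)
{
    Py_Initialize();                                   /* start the embedded interpreter */

    PyObject *module = PyImport_ImportModule("my_inference");
    if (module == NULL) {
        PyErr_Print();
        Py_FinalizeEx();
        return 1;
    }

    PyObject *func = PyObject_GetAttrString(module, "run");
    if (func == NULL || !PyCallable_Check(func)) {
        if (PyErr_Occurred()) PyErr_Print();
        Py_XDECREF(func);
        Py_DECREF(module);
        Py_FinalizeEx();
        return 1;
    }

    PyObject *result = PyObject_CallObject(func, NULL);   /* call my_inference.run() */
    if (result == NULL) PyErr_Print();

    Py_XDECREF(result);
    Py_DECREF(func);
    Py_DECREF(module);
    Py_FinalizeEx();                                   /* shut the interpreter down */
    return 0;
}

 Even this toy example has to manage the interpreter lifetime and reference counts by hand, which is part of why I call this approach complicated.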

(D) Difficult to speed up

 Python is an interpreted language, so its processing is not fast. It is a little strange that the language used in state-of-the-art research is slow, but in fact I sometimes use C code to accelerate heavy processing such as image loading*5.

 In my personal view, the speed that research cares about is not an absolute value meaningful to users but a value relative to other research; academic papers are full of phrases like "our method is faster than previous work." In addition, much research runs on high-spec GPUs costing millions of yen.

 I have heard that researchers at the University of Tokyo still use FORTRAN. Research always runs outside the user's perspective.

In short...

 Python is not a language designed for users. On the other hand, inference with C code (and tensorflow.dll), which I introduced previously*6, can solve all of the problems above. I think it is effectively the only choice. A rough sketch is shown below.
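 For reference, here is a minimal sketch of what such inference through tensorflow.dll can look like with the TensorFlow C API. The SavedModel path and the operation names (serving_default_input_1, StatefulPartitionedCall) are assumptions that depend on how your model was exported, and error handling is trimmed to a minimum; treat it as an outline, not a drop-in implementation.

/* Minimal sketch of inference via the TensorFlow C API (tensorflow.dll).
 * The model path and operation names below are assumptions; check your own SavedModel. */
#include <stdio.h>
#include <stdint.h>
#include <tensorflow/c/c_api.h>

/* The input buffer is static storage, so the tensor deallocator does nothing. */
static void no_free(void* data, size_t len, void* arg) { (void)data; (void)len; (void)arg; }

int main(void)
{
    TF_Status* status = TF_NewStatus();
    TF_Graph* graph = TF_NewGraph();
    TF_SessionOptions* opts = TF_NewSessionOptions();

    const char* tags[] = { "serve" };
    TF_Session* session = TF_LoadSessionFromSavedModel(
        opts, NULL, "saved_model_dir", tags, 1, graph, NULL, status);   /* hypothetical path */

    /* Look up input/output operations; the names depend on how the model was exported. */
    TF_Output input  = { TF_GraphOperationByName(graph, "serving_default_input_1"), 0 };
    TF_Output output = { TF_GraphOperationByName(graph, "StatefulPartitionedCall"), 0 };

    static float pixels[1 * 256 * 256 * 1] = { 0 };    /* one 256 x 256 grayscale image */
    int64_t dims[4] = { 1, 256, 256, 1 };
    TF_Tensor* in_tensor = TF_NewTensor(TF_FLOAT, dims, 4,
                                        pixels, sizeof(pixels), no_free, NULL);

    TF_Tensor* out_tensor = NULL;
    TF_SessionRun(session, NULL,
                  &input, &in_tensor, 1,      /* inputs  */
                  &output, &out_tensor, 1,    /* outputs */
                  NULL, 0, NULL, status);

    if (TF_GetCode(status) == TF_OK) {
        float* result = (float*)TF_TensorData(out_tensor);
        printf("first output value: %f\n", result[0]);
    }

    /* Clean up. */
    TF_DeleteTensor(in_tensor);
    if (out_tensor) TF_DeleteTensor(out_tensor);
    TF_CloseSession(session, status);
    TF_DeleteSession(session, status);
    TF_DeleteSessionOptions(opts);
    TF_DeleteGraph(graph);
    TF_DeleteStatus(status);
    return 0;
}

 The tensor shape (1 x 256 x 256 x 1) follows the U-Net input size discussed in the next section; adjust it to your own model.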

2. GPU or CPU

 A GPU is necessary for training a neural network, and I often have trouble setting up the development environment*7. In many cases the problem is the NVIDIA drivers: there are strict compatibility requirements among the deep learning library (TensorFlow), the NVIDIA libraries (CUDA and cuDNN), and the NVIDIA driver. More than once my environment has suddenly stopped working after an unintended update.

 Windows Update can automatically update NVIDIA drivers, and users may also update drivers or libraries manually so that other software can use the GPU.

 When the GPU environment breaks, I suspect it is very hard to get users to rebuild it. This is not to belittle users: even I sometimes need a whole day to recover a broken environment. Imagine that your app is running on a production line and it suddenly stops... You break out in a cold sweat, don't you?

 I wrote before that the inference time for U-Net (256 x 256) with CPU and C code was about 500 ms*8. I think that is good enough for many cases.

 If faster inference is required, you have to ask users not to use the GPU in other software, which means your app effectively occupies the computer.

3. Inference on embedded or cloud platforms

 Although this contradicts my own argument... I can also say that implementing inference on general-purpose Windows computers is not ideal in the first place. An embedded computer such as a Raspberry Pi, used as a dedicated resource, is more suitable.

 Using a cloud service such as Google Cloud Platform is also a good option. Given that Google's strategy is to run large-scale data analysis in the cloud, it is natural that the environment and information for inference on Windows are poor.

4. Afterword

 I sense a future in which the use of Python is forced even on the inference (that is, the user's) side. Today's article is my personal warning against that. I will also look into Android and iOS.