Deep Learning in Clojure with Fewer Parentheses than Keras and Python


September 17, 2020



Deep Diamond is a new Deep Learning library written in Clojure. Its goal is to be simple and superfast, and to support both CPU and GPU computing. Try it out!

But it's Clojure, you might say. Why not Python? Python is simple. Python is easy. Python is supported by Google and Facebook. And Clojure… Clojure is a… you know… a… Lisp! Lisp is diiifiiicuuuult. Lisp has soooo many pareentheses. No one wants to write their DL models like ((((((model.add (((Conv2d etc.))))))))))))))))))))))))). And Clojure is… you know… running on the JVM, and the JVM is heeeeavy and sloooooow. And no one is using it except Rich Hickey and his guitar.

Neither of these complaints is true! I'll demonstrate this by direct comparison with the paragon of simplicity and elegance of deep learning in Python: Keras. In this post, I'll take a convolutional neural network from the Keras examples.

Below is the relevant model code, first in Keras, and then in Deep Diamond. You can compare them aesthetically. Keras is a high bar to clear, but I think that Deep Diamond's code is even more straightforward.

But the main point is easy to quantify: parentheses and other punctuation! Simple counting shows that Deep Diamond uses fewer parentheses, fewer punctuation symbols overall, and fewer constructs, and carries less incidental complexity.

I'll argue that its code has nicer layout, too, and a fine sprinkle of colorful symbols in well balanced places, but that's just my subjective preference. Don't look at that nice sparkling thing.

Keras CNN in Python

model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3),
                 activation='relu',
                 input_shape=(28, 28, 1)))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes, activation='softmax'))

model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer=Adam(learning_rate=0.01),
              metrics=['accuracy'])

model.fit(x_train, y_train,
          batch_size=128,
          epochs=12)

Deep Diamond CNN in Clojure

(defonce net-bp
  (network (desc [128 1 28 28] :float :nchw)
           [(convo [32] [3 3] :relu)
            (convo [64] [3 3] :relu)
            (pooling [2 2] :max)
            (dropout)
            (dense [128] :relu)
            (dropout)
            (dense [10] :softmax)]))

(defonce net (init! (net-bp :adam)))

(train net train-images y-train :crossentropy 12 [])
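
For context (and not counted in the comparison below, since the Keras snippet's imports are not shown either): the Deep Diamond code above assumes that the relevant vars are required from Deep Diamond's tensor and dnn namespaces, and that train-images and y-train already hold the MNIST images and one-hot encoded labels as tensors. A minimal sketch of that setup, under those assumptions, could look like the following; the namespace name is just a placeholder, so check the current Deep Diamond docs for the exact requires.

(ns mnist-example.core   ;; placeholder namespace name
  (:require [uncomplicate.diamond.tensor :refer [desc]]
            [uncomplicate.diamond.dnn :refer [network convo pooling dropout
                                              dense init! train]]))

;; train-images and y-train are assumed to be tensors holding the MNIST
;; images and one-hot encoded labels; loading the data is not shown here.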

Let's count

How about the number of dreaded parentheses, ( and )?

                   Python   Clojure
( and )            48       28
(, ), [, and ]     50       48
Grouped (())       8        2
)))                2        1
,                  17       0
model.add          8        0

As we can see from the table, on every punctuation metric that I could think of, Deep Diamond and Clojure fare better than Keras & Python.

Keras uses almost twice as many parentheses as Deep Diamond. Clojure uses [] for vector literals, which Deep Diamond uses for tensor shapes. You might note that there are more than a few of these and argue that they are parentheses, too. Fine. Add them all up, and Clojure still fares slightly better than Python!

A parenthesis here and there is not a problem, but there are horror tales of ((((((( and ))))))) in Lisps. Not in Clojure. Note that there is not a single (( in the Clojure example, and only two occurrences of )). In the Python code, there are 8.

Then we come to all the additional assorted punctuation in Python: commas, dots, and so on. Clojure has none of it, while Python has dozens of occurrences.

Python is also riddled with redundant stuff such as model.add().

Etc., etc. You get my point.
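
If you want to double-check the tallies, a throwaway helper does the job. Here's a rough sketch, where keras-source and diamond-source are hypothetical vars holding the two snippets above as plain strings:

;; Count how many characters of a snippet belong to the given character set.
(defn count-chars [source chars]
  (count (filter (set chars) source)))

(count-chars keras-source "()")     ;; parentheses in the Keras code
(count-chars diamond-source "()")   ;; parentheses in the Deep Diamond code
(count-chars keras-source "()[]")   ;; parentheses and brackets together
(count-chars diamond-source "()[]")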

Speed

But, speed! - you might say. Deep Diamond is faster, too (at least for this model), but that is a nice topic for another blog post :) Both tools are free, so you can try them for yourself in the meantime.

The books

Should I mention that the book Deep Learning for Programmers: An Interactive Tutorial with CUDA, OpenCL, DNNL, Java, and Clojure teaches the nuts and bolts of neural networks and deep learning by showing you how Deep Diamond is built, from scratch, in interactive sessions? Each line of code can be executed and the results inspected in a plain Clojure REPL. The best way to master something is to build it yourself!

It's simple. But fast and powerful!

Please subscribe, read the drafts, get the full book soon, and support my work on this free open source library.
