Feed Aggregator Page 667
Rendered on Tue, 29 Dec 2020 06:25:36 GMT
via Elm - Latest posts by @rupert Rupert Smith on Mon, 28 Dec 2020 09:38:26 GMT
Before going down the route described below, I would check that the “maximum call stack size exceeded” error is not coming from your own Elm code. What are you doing with the 250K elements once you receive them through the port?
One trick that may help: I believe it is possible to pass a `File` through a port.
I have never done this myself, so I cannot confirm it works. Perhaps someone else can verify?
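A sketch of what the JavaScript side of that might look like (unverified, per the caveat above; the port name `receiveFile` and the app setup are hypothetical):

```javascript
// Hypothetical sketch: the File object itself crosses the port by reference,
// so nothing the size of the decoded pixel data gets serialized here.
// "receiveFile" is an assumed port name; on the Elm side you would declare a
// port taking a Json.Decode.Value and decode it with File.decoder from elm/file.
function sendFileToElm(app, file) {
  app.ports.receiveFile.send(file);
}
```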
via Elm - Latest posts by @system system on Mon, 28 Dec 2020 06:21:45 GMT
This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.
via Elm - Latest posts by @system system on Mon, 28 Dec 2020 03:51:31 GMT
This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.
via Elm - Latest posts by @paulh Paul Hollyer on Mon, 28 Dec 2020 00:56:50 GMT
Hello, welcome to Elm.
Can you show your code? I've just sent through an array of 50 million integers as a flag, and traced out `List.length` on the incoming array, so it doesn't look like the size of the array is the problem.
It might be how you're manipulating the data when it's received? Code examples would help others help you.
Edit: I've also just sent through a `List (List Int)` of length 250,000, where each of the inner `List`s contains 100 `Int`s, for a total of 25,000,000 elements.
When it was received by Elm, I ran the following without a problem, albeit it took a few seconds:
Debug.log "a" (List.concat flags.a |> List.map String.fromInt |> List.length)
This logged `a: 25000000` to the console.
via Elm - Latest posts by @noise-machines Thomas Bailey on Sun, 27 Dec 2020 23:35:50 GMT
Hi folks!
New Elm user here. I’m a very experienced front-end engineer with a preference for functional languages. (I’ve studied Haskell and Clojure in my free time for the fun of it.) I also do generative art, normally in JS. I was excited to try out Elm as an alternative for my art projects, but ran into a problem sending large arrays through ports.
Here’s the background:
I’ve got a library that loads images into arrays of integers in JS. In this particular case, I was trying to load an image of a 500x500 heightmap. The image contains 500 x 500 = 250,000 pixels, and each pixel has four channels (red, green, blue, and alpha). So the whole array contained a million elements.
I’ve worked with arrays this size in JS with no problem, but when I tried to pass it through a port to my Elm code, I got a “maximum call stack size exceeded” error. When I tried passing a shorter array through, it worked fine.
Interacting with the image data is a key part of the code for this art project, so I don’t want to just handle that part of the app in JS. There’d be too much back and forth through ports.
Do you all have any thoughts or ideas for getting around this? Maybe I could send smaller pieces of the array through one at a time, and gradually concatenate the full array on the Elm side? I also tried passing the data through as a flag, and ran into the same problem.
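The chunking idea can be sketched on the JavaScript side like this (all port names and the chunk size here are hypothetical; the Elm side would subscribe to the chunk port and concatenate as chunks arrive):

```javascript
// Sketch of the chunking idea: split the million-element pixel array into
// fixed-size slices and send them through the port one at a time, so no
// single port message carries the whole structure.
function chunks(array, size) {
  const out = [];
  for (let i = 0; i < array.length; i += size) {
    out.push(array.slice(i, i + size));
  }
  return out;
}

function sendPixels(app, pixels, chunkSize = 50000) {
  for (const chunk of chunks(pixels, chunkSize)) {
    app.ports.receiveChunk.send(chunk);
  }
  // Signal completion so the Elm side knows it has the full array.
  app.ports.pixelsDone.send(null);
}
```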
via Elm - Latest posts by @system system on Sun, 27 Dec 2020 16:10:26 GMT
This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.
via Elm - Latest posts by @system system on Sun, 27 Dec 2020 10:53:17 GMT
This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.
via Planet Lisp by on Sat, 26 Dec 2020 19:04:59 GMT
This is a small and simple documentation builder. It was removed from Quicklisp in 2014 because the project is SBCL-only, but I've added it to Ultralisp, and you can test it after upgrading to the latest version.
CL-API is suitable for building a reference for third-party libraries if they don't have their own documentation. But its inability to process handwritten chapters or to work with package-inferred systems makes it unusable for 40ants projects.
As always, I've created a template repository for you.
Here is an example project's documentation built with CL-API:
https://cl-doc-systems.github.io/cl-api/
Use this template if you are making a small library which needs autogenerated API reference.
Also, you'll find a "Pros & Cons" section in the README:
https://github.com/cl-doc-systems/cl-api
Here you will find template projects for other documentation systems.
Choose the one that best suits your needs:
via Elm - Latest posts by @system system on Sat, 26 Dec 2020 17:43:00 GMT
This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.
via Elm - Latest posts by @hanifhefaz Hanif Hefaz on Sat, 26 Dec 2020 14:55:33 GMT
Hello Everyone,
I have a function called wordsDict, which generates a dictionary for each of a list of strings, mapping each word to its count. I have designed some tests and they work fine.
The function is:
wordsDict : List String -> List (Dict String Int)
wordsDict =
    List.map (tokenize >> toHistogram)
for example, the test:
test "test making dictionary from the data with two sentences." <|
    \() ->
        let
            dataText =
                [ "test", "testing" ]
        in
        Expect.equal [ Dict.fromList [ ( "test", 1 ) ], Dict.fromList [ ( "testing", 1 ) ] ]
            (dataText |> Main.wordsDict)
This test runs and passes as expected.
Now, if I understand it correctly, fuzz testing is used to run the test 100 times with randomly generated inputs.
How can I implement a fuzz test for this function? Will fuzz testing change these two strings?
[ "test", "testing" ]
Thank You.
via Planet Lisp by on Mon, 21 Dec 2020 05:10:24 GMT
So I've plowed some of my vacation time into polishing up/hacking on some old projects. Including `house`, the web server I complained was garbage, but which still had one distinct advantage over other Common Lisp web servers. Namely: because it's the only natively implemented one, it will work out-of-the-box, without issue, anywhere you can install `quicklisp` and a Lisp it runs on.
This hacking attempt was aimed at addressing that complaint. Most of the `major-overhaul` branch was aimed at making the code more readable and sensical, making `handler`s and `http-type`s much simpler, both implementationally and conceptually. But I want to throw at least a little effort at performance. With that in mind, I wanted a preliminary benchmark. I'm following fukamachi's procedure for `woo`. Note that, since `house` is a single-threaded server (for now), I'm only doing single-threaded benchmarks.
; SLIME 2.26
CL-USER> (ql:quickload :house)
To load "house":
Load 1 ASDF system:
house
; Loading "house"
.....
(:HOUSE)
CL-USER> (in-package :house)
#<PACKAGE "HOUSE">
HOUSE> (define-handler (root) () "Hello world!")
#<HANDLER-TABLE {1004593CF3}>
HOUSE> (house:start 5000)
inaimathi@this:~$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.01ms 5.85ms 204.63ms 98.73%
Req/Sec 2.64k 0.89k 7.22k 62.16%
104779 requests in 10.10s, 30.58MB read
Socket errors: connect 0, read 104775, write 0, timeout 0
Requests/sec: 10374.93
Transfer/sec: 3.03MB
inaimathi@this:~$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 2.74ms 19.05ms 408.54ms 98.18%
Req/Sec 2.58k 0.85k 4.64k 57.39%
102543 requests in 10.10s, 29.92MB read
Socket errors: connect 0, read 102539, write 0, timeout 0
Requests/sec: 10152.79
Transfer/sec: 2.96MB
inaimathi@this:~$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 4.56ms 59.54ms 1.66s 99.27%
Req/Sec 3.10k 1.83k 9.56k 76.72%
103979 requests in 10.01s, 30.34MB read
Socket errors: connect 0, read 103979, write 0, timeout 4
Requests/sec: 10392.46
Transfer/sec: 3.03MB
inaimathi@this:~$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 8.49ms 85.22ms 1.66s 98.81%
Req/Sec 3.23k 2.16k 11.90k 81.01%
102236 requests in 10.01s, 29.83MB read
Socket errors: connect 0, read 102232, write 0, timeout 4
Requests/sec: 10215.87
Transfer/sec: 2.98MB
inaimathi@this:~$
So that puts `house` comfortably in the same league as Tornado on PyPy or the node.js server. This is not a bad league to be in, but I want to see if I can do better.
`defmethod` is a thing I was seemingly obsessed with when I wrote `house`. This isn't necessarily a bad thing from the legibility perspective; because methods have type annotations, it's clearer from a reading of the code what an expected input is. However, there are two disadvantages to using `method`s where you don't have to:

- they signal an opaque `no-applicable-method` error on weird input, rather than something more descriptive and specific, the way you probably would when using a normal function
- they dispatch on their argument types at runtime, which carries a performance cost

The first point is a nit, but the second one is worth dealing with in the context of a library that should probably perform reasonably well at least some of the time. The cause of that problem is that `method`s can't be `inline`d. Because the point of them is to dispatch on a type-table of their arguments at runtime, they can't do their work at compile-time to inline the result without some serious trickery[1]. Today, I'm avoiding trickery and just re-writing every `method` in `house` that I can into a function, usually by using `etypecase`.
Some of these are trivial conversions:
;;; house.lisp
...
-(defmethod start ((port integer) &optional (host usocket:*wildcard-host*))
+(defun start (port &optional (host usocket:*wildcard-host*))
+ (assert (integerp port))
...
-(defmethod process-ready ((ready stream-server-usocket) (conns hash-table))
- (setf (gethash (socket-accept ready :element-type 'octet) conns) nil))
-
-(defmethod process-ready ((ready stream-usocket) (conns hash-table))
+(defun process-ready (ready conns)
+ (assert (hash-table-p conns))
+ (etypecase ready
+ (stream-server-usocket (setf (gethash (socket-accept ready :element-type 'octet) conns) nil))
+ (stream-usocket
...
-(defmethod parse-cookies ((cookie string))
+(defun parse-cookies (cookie)
+ (assert (stringp cookie))
...
-(defmethod handle-request! ((sock usocket) (req request))
+(defun handle-request! (sock req)
...
-(defmethod error! ((err response) (sock usocket) &optional instance)
- (declare (ignorable instance))
+(defun error! (err sock)
...
;;; session.lisp
...
-(defmethod new-session-hook! ((callback function))
+(defun new-session-hook! (callback)
...
-(defmethod poke! ((sess session))
+(defun poke! (sess)
...
;;; util.lisp
...
-(defmethod path->uri ((path pathname) &key stem-from)
+(defun path->uri (path &key stem-from)
...
-(defmethod path->mimetype ((path pathname))
+(defun path->mimetype (path)
...
Some are slightly more complicated. In particular, `parse` looks like it would conflate two entirely separate functions, but on inspection, we know the type of its argument at every call site.
./house.lisp:46: (setf (parameters (request buf)) (nconc (parse buf) (parameters (request buf)))))
./house.lisp:68: do (multiple-value-bind (parsed expecting) (parse buffer)
./house.lisp:92:(defmethod parse ((str string))
./house.lisp:110:(defmethod parse ((buf buffer))
./house.lisp:116: (parse str))))
So, we can convert `parse` into two separate, named functions. `write!` is basically the same situation.
;;; house.lisp
...
-(defmethod parse ((str string))
+(defun parse-request-string (str)
...
-(defmethod parse ((buf buffer))
+(defun parse-buffer (buf)
...
-(defmethod write! ((res response) (stream stream))
+(defun write-response! (res stream)
...
-(defmethod write! ((res sse) (stream stream))
+(defun write-sse! (res stream)
...
Not pictured; changes at each call-site to call the correct one.
The `parse-params` method is a bit harder to tease out, because it looks like it genuinely is one polymorphic function. Again, though, closer inspection of the call sites, which are fully internal to `house`, makes it clear that we almost always know what we're passing as arguments at compile-time.
./house.lisp:78:(defmethod parse-params (content-type (params null)) nil)
./house.lisp:79:(defmethod parse-params (content-type (params string))
./house.lisp:83:(defmethod parse-params ((content-type (eql :application/json)) (params string))
./house.lisp:107: (setf (parameters req) (parse-params nil parameters))
./house.lisp:113: (parse-params
(->keyword (cdr (assoc :content-type (headers (request buf)))))
str)
That "almost" is going to be a slight pain though; we need to do a runtime dispatch inside of `parse-buffer` to figure out whether we're parsing JSON or a param-encoded string.
...
-(defmethod parse-params (content-type (params null)) nil)
-(defmethod parse-params (content-type (params string))
+(defun parse-param-string (params)
(loop for pair in (split "&" params)
- for (name val) = (split "=" pair)
- collect (cons (->keyword name) (or val ""))))
-
-(defmethod parse-params ((content-type (eql :application/json)) (params string))
- (cl-json:decode-json-from-string params))
+ for (name val) = (split "=" pair)
+ collect (cons (->keyword name) (or val ""))))
...
- (parse-params
- (->keyword (cdr (assoc :content-type (headers (request buf)))))
- str)
- (parse str))))
+ (if (eq :application/json (->keyword (cdr (assoc :content-type (headers (request buf))))))
+ (cl-json:decode-json-from-string str)
+ (parse-param-string str))
+ (parse-request-string str))))
...
The last one is going to be a headache. The `lookup` method is meant to be a general accessor, and has a `setf` method defined. I'm not going that way right now; let's see if we gained anything with our current efforts.
Second verse same as the first.
; SLIME 2.26
CL-USER> (ql:quickload :house)
To load "house":
Load 1 ASDF system:
house
; Loading "house"
.....
(:HOUSE)
CL-USER> (in-package :house)
#<PACKAGE "HOUSE">
HOUSE> (define-handler (root) () "Hello world!")
#<HANDLER-TABLE {1004593CF3}>
HOUSE> (house:start 5000)
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.96ms 4.02ms 76.87ms 98.43%
Req/Sec 2.70k 0.98k 7.57k 73.83%
103951 requests in 10.10s, 30.34MB read
Socket errors: connect 0, read 103947, write 0, timeout 0
Requests/sec: 10292.48
Transfer/sec: 3.00MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 846.32us 2.63ms 58.29ms 98.26%
Req/Sec 2.64k 0.94k 11.13k 72.89%
102661 requests in 10.10s, 29.96MB read
Socket errors: connect 0, read 102658, write 0, timeout 0
Requests/sec: 10165.46
Transfer/sec: 2.97MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 8.57ms 90.07ms 1.66s 98.96%
Req/Sec 3.71k 2.87k 11.73k 74.30%
105162 requests in 10.10s, 30.69MB read
Socket errors: connect 0, read 105159, write 0, timeout 2
Requests/sec: 10412.91
Transfer/sec: 3.04MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 5.69ms 70.32ms 1.66s 99.25%
Req/Sec 3.06k 1.82k 9.46k 74.40%
101302 requests in 10.10s, 29.56MB read
Socket errors: connect 0, read 101299, write 0, timeout 3
Requests/sec: 10030.14
Transfer/sec: 2.93MB
inaimathi@this:~/quicklisp/local-projects/house$
Aaand it looks like the effect was negligible. Oh well. I honestly think that the untangling we've done so far makes the parts of the codebase it's touched more readable, so I'm keeping the changes, but there's no great improvement yet. Perhaps if we inline some things?
;;; package.lisp
-(declaim (inline crlf write-ln idling? flex-stream))
+(declaim (inline crlf write-ln idling? flex-stream write-response! write-sse! process-ready parse-param-string parse-request-string))
wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.71ms 15.37ms 412.51ms 98.91%
Req/Sec 2.69k 0.91k 6.28k 65.37%
103607 requests in 10.10s, 30.24MB read
Socket errors: connect 0, read 103603, write 0, timeout 0
Requests/sec: 10258.44
Transfer/sec: 2.99MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 837.49us 2.66ms 58.36ms 98.36%
Req/Sec 2.63k 836.52 3.81k 49.37%
103449 requests in 10.10s, 30.19MB read
Socket errors: connect 0, read 103446, write 0, timeout 0
Requests/sec: 10242.91
Transfer/sec: 2.99MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 6.23ms 74.76ms 1.89s 99.08%
Req/Sec 4.01k 2.20k 10.23k 58.89%
101524 requests in 10.10s, 29.63MB read
Socket errors: connect 0, read 101522, write 0, timeout 4
Requests/sec: 10052.56
Transfer/sec: 2.93MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 5.75ms 70.98ms 1.67s 99.27%
Req/Sec 3.19k 2.11k 10.26k 81.39%
100944 requests in 10.01s, 29.46MB read
Socket errors: connect 0, read 100941, write 0, timeout 1
Requests/sec: 10088.23
Transfer/sec: 2.94MB
Again, no huge difference. On closer inspection, `lookup` is only used in one place internally, and it's easy to replace with `gethash`, so I'm just going to do that and re-check real quick.
;;; channel.lisp
...
- (push sock (lookup channel *channels*))
+ (push sock (gethash channel *channels*))
...
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.95ms 3.72ms 72.70ms 98.43%
Req/Sec 2.66k 1.00k 11.52k 73.45%
102839 requests in 10.10s, 30.01MB read
Socket errors: connect 0, read 102835, write 0, timeout 0
Requests/sec: 10183.46
Transfer/sec: 2.97MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.87ms 2.85ms 59.32ms 98.19%
Req/Sec 2.62k 0.86k 3.87k 54.82%
102818 requests in 10.10s, 30.00MB read
Socket errors: connect 0, read 102814, write 0, timeout 0
Requests/sec: 10180.62
Transfer/sec: 2.97MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 6.96ms 80.03ms 1.68s 99.10%
Req/Sec 3.11k 2.12k 11.72k 78.40%
105460 requests in 10.10s, 30.78MB read
Socket errors: connect 0, read 105456, write 0, timeout 5
Requests/sec: 10441.77
Transfer/sec: 3.05MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 8.22ms 83.95ms 1.66s 98.84%
Req/Sec 3.19k 2.07k 11.66k 73.23%
103933 requests in 10.10s, 30.33MB read
Socket errors: connect 0, read 103930, write 0, timeout 5
Requests/sec: 10290.43
Transfer/sec: 3.00MB
To no one's great surprise, still not much of a difference. I'm going to let the `lookup` issue dangle for the moment, because it has to do with a trick I want to pull a bit later on, but before we get to that...
The second step is to kill `class` definitions entirely. Their `accessor` functions are also generic, and therefore rely on method dispatch. `struct`s are a bit clumsier, but probably faster in the end. Now, we can't really mess with `session`, `request` and `response`, because those are part of `house`'s external interface, but there are three places where we can replace `defclass` with `defstruct`.
Re-writing `buffer`, `sse` and `handler-entry`...
;;; model.lisp
...
-(defclass sse ()
- ((id :reader id :initarg :id :initform nil)
- (event :reader event :initarg :event :initform nil)
- (retry :reader retry :initarg :retry :initform nil)
- (data :reader data :initarg :data)))
...
-(defclass buffer ()
- ((tries :accessor tries :initform 0)
- (contents :accessor contents :initform nil)
- (bi-stream :reader bi-stream :initarg :bi-stream)
- (total-buffered :accessor total-buffered :initform 0)
- (started :reader started :initform (get-universal-time))
- (request :accessor request :initform nil)
- (expecting :accessor expecting :initform 0)))
...
-(defclass handler-entry ()
- ((fn :reader fn :initarg :fn :initform nil)
- (closing? :reader closing? :initarg :closing? :initform t)))
...
;;; house.lisp
...
-(defun write-sse! (res stream)
- (format stream "~@[id: ~a~%~]~@[event: ~a~%~]~@[retry: ~a~%~]data: ~a~%~%"
- (id res) (event res) (retry res) (data res)))
...
-(defun buffer! (buffer)
- (handler-case
- (let ((stream (bi-stream buffer)))
- (incf (tries buffer))
- (loop for char = (read-char-no-hang stream)
- until (or (null char) (eql :eof char))
- do (push char (contents buffer))
- do (incf (total-buffered buffer))
- when (request buffer) do (decf (expecting buffer))
- when (and #-windows(char= char #\linefeed)
- #+windows(char= char #\newline)
- (line-terminated? (contents buffer)))
- do (multiple-value-bind (parsed expecting) (parse-buffer buffer)
- (setf (request buffer) parsed
- (expecting buffer) expecting
- (contents buffer) nil)
- (return char))
- when (> (total-buffered buffer) +max-request-size+) return char
- finally (return char)))
- (error () :eof)))
...
-(defun parse-buffer (buf)
- (let ((str (coerce (reverse (contents buf)) 'string)))
- (if (request buf)
- (if (eq :application/json (->keyword (cdr (assoc :content-type (headers (request buf))))))
- (cl-json:decode-json-from-string str)
- (parse-param-string str))
- (parse-request-string str))))
...
;;; define-handler.lisp
+(defstruct handler-entry
+ (fn nil)
+ (closing? t))
...
- (make-instance
- 'handler-entry
+ (make-handler-entry
;;; channel.lisp
...
+(defstruct (sse (:constructor make-sse (data &key id event retry)))
+ (id nil) (event nil) (retry nil)
+ (data (error "an SSE must have :data") :type string))
...
-(defun make-sse (data &key id event retry)
- (make-instance 'sse :data data :id id :event event :retry retry))
+(defun write-sse! (res stream)
+ (format stream "~@[id: ~a~%~]~@[event: ~a~%~]~@[retry: ~a~%~]data: ~a~%~%"
+	  (sse-id res) (sse-event res) (sse-retry res) (sse-data res)))
...
;;; buffer.lisp
+(in-package :house)
+
+(defstruct (buffer (:constructor make-buffer (bi-stream)))
+ (tries 0 :type integer)
+ (contents nil)
+ (bi-stream nil)
+ (total-buffered 0 :type integer)
+ (started (get-universal-time))
+ (request nil)
+ (expecting 0 :type integer))
+
+(defun buffer! (buffer)
+ (handler-case
+ (let ((stream (buffer-bi-stream buffer)))
+ (incf (buffer-tries buffer))
+ (loop for char = (read-char-no-hang stream)
+ until (or (null char) (eql :eof char))
+ do (push char (buffer-contents buffer))
+ do (incf (buffer-total-buffered buffer))
+ when (buffer-request buffer) do (decf (buffer-expecting buffer))
+ when (and #-windows(char= char #\linefeed)
+ #+windows(char= char #\newline)
+ (line-terminated? (buffer-contents buffer)))
+ do (multiple-value-bind (parsed expecting) (parse-buffer buffer)
+ (setf (buffer-request buffer) parsed
+ (buffer-expecting buffer) expecting
+ (buffer-contents buffer) nil)
+ (return char))
+ when (> (buffer-total-buffered buffer) +max-request-size+) return char
+ finally (return char)))
+ (error () :eof)))
+
+(defun parse-buffer (buf)
+ (let ((str (coerce (reverse (buffer-contents buf)) 'string)))
+ (if (buffer-request buf)
+ (if (eq :application/json (->keyword (cdr (assoc :content-type (headers (buffer-request buf))))))
+ (cl-json:decode-json-from-string str)
+ (parse-param-string str))
+ (parse-request-string str))))
...should get us *something*. Right?
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.09ms 6.18ms 202.73ms 98.55%
Req/Sec 2.69k 0.89k 4.02k 56.74%
105108 requests in 10.10s, 30.67MB read
Socket errors: connect 0, read 105105, write 0, timeout 0
Requests/sec: 10406.92
Transfer/sec: 3.04MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 10 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 10 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.98ms 5.78ms 204.47ms 98.86%
Req/Sec 2.67k 848.77 3.98k 54.71%
104242 requests in 10.10s, 30.42MB read
Socket errors: connect 0, read 104242, write 0, timeout 0
Requests/sec: 10321.40
Transfer/sec: 3.01MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 6.93ms 79.75ms 1.66s 99.10%
Req/Sec 3.33k 2.46k 11.95k 79.87%
105920 requests in 10.10s, 30.91MB read
Socket errors: connect 0, read 105918, write 0, timeout 2
Requests/sec: 10487.59
Transfer/sec: 3.06MB
inaimathi@this:~/quicklisp/local-projects/house$ wrk -c 100 -t 4 -d 10 http://127.0.0.1:5000
Running 10s test @ http://127.0.0.1:5000
4 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 4.78ms 61.11ms 1.68s 99.30%
Req/Sec 2.83k 1.26k 7.01k 70.22%
103381 requests in 10.10s, 30.17MB read
Socket errors: connect 0, read 103378, write 0, timeout 0
Requests/sec: 10235.14
Transfer/sec: 2.99MB
Very little noticeable gain, I'm afraid. Ok, there's one more thing I'm tempted to try. There were hints earlier that this was coming, including this, but if you don't follow my github you might still be surprised.
Now that we have what I think is a reasonably fast implementation of `house`, I want to see whether[2] [clj](https://github.com/inaimathi/clj) does performance damage to the implementation. I want to see this because the `clj` datastructures and syntax really improve readability and REPL development; there were a bunch of situations in which I missed having that level of visibility into my structures before I even began this benchmark article. There are even probably a few places where it saves some performance by referencing other partial structures. The problem is that I'm guessing it's a net negative in terms of performance, so I want to see what a conversion would do to my benchmark before I go through with it.
This is going to be especially useful for `house`'s external interface. And given that I've already had to break compatibility to write this overhaul, this is probably the best possible time to test the theory. The trouble is that I'm not entirely sure what the real interface looks like quite yet, so I'm not going to be implementing it today. These are just some musings.
The current `house` model for `handler`/`response` interaction is that a handler returns either a `response` (in the event of a `redirect!`) or a `string` (in any other event). This makes a few things kind of difficult. Firstly, it means that `session` and `header` manipulation has to happen by effect. That is, they're not included as part of the return value; they have to be exposed in some other way. In the case of `headers`, it's via an `alist` bound to the invisible symbol `headers` inside of the handler body. This ... is less than ideal.
If we take the `http-kit` approach, we'd expect our handlers to always return a `map`. And if that `map` had slots for `headers`/`session`, those things would be set as appropriate in the outgoing `response` and/or server state. Our input would also be a `map`, and it would naturally contain `method`/`headers`/`path`/`parameters`/`session`/etc. slots that a handler writer would want to make use of. I'm not entirely clear on whether we'd want to make this the primary internal and external representation, or if we're just looking for an easily manipulated layer for the users. I'm leaning towards the first of those options.
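That map-in/map-out shape can be sketched, in JavaScript purely for illustration (none of these names are house's actual API; the handler and slot names are invented):

```javascript
// Illustrative sketch, not house's actual API: a handler is a pure function
// from a request map to a response map. Header and session changes travel in
// the return value instead of happening by side effect on ambient bindings.
function helloHandler(req) {
  return {
    status: 200,
    headers: { "Content-Type": "text/plain" },
    // The server core would persist this slot back into server state.
    session: { ...req.session, visits: (req.session.visits || 0) + 1 },
    body: `Hello, ${req.parameters.name || "world"}!`,
  };
}
```

The design win is that a handler like this is trivially testable: you feed it a plain request map and assert on the returned map, no running server required.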
This ... actually doesn't sound too hard if cut at the right level. Let's give it a shot, I guess.
It wasn't.
There's enough weird shit happening here that I need a fresh brain for it. That was enough for now. The main roadblock I hit is that it turns out a lot more of the internal interface here depends on mutation than I thought. This is bad for readability and conceptual simplicity, but good in the sense that I can move away from these models first, then see about integrating `clj` later.
I'll probably take another run up this hill later, but for now, I think I'm moving on to other issues.
1. `etypecase` is the right way to go. But if you want the callers of your code to be able to define new behaviors for datastructures they specify themselves, then absolutely reach for `defmethod`. ↩

via Planet Lisp by on Mon, 21 Dec 2020 01:27:00 GMT
New projects:
Updated projects: 3bmd, 3bz, 3d-matrices, 3d-vectors, adopt, algae, april, arc-compat, architecture.builder-protocol, array-utils, arrow-macros, aws-sign4, bdef, binpack, check-bnf, cl-ana, cl-ansi-text, cl-bunny, cl-catmull-rom-spline, cl-cffi-gtk, cl-collider, cl-conllu, cl-covid19, cl-custom-hash-table, cl-digraph, cl-environments, cl-gamepad, cl-gd, cl-glfw3, cl-gserver, cl-interpol, cl-kraken, cl-liballegro, cl-liballegro-nuklear, cl-libyaml, cl-lzlib, cl-markless, cl-maxminddb, cl-mime, cl-mixed, cl-mongo-id, cl-naive-store, cl-octet-streams, cl-pass, cl-patterns, cl-pdf, cl-portaudio, cl-prevalence, cl-randist, cl-rdkafka, cl-sdl2, cl-sdl2-mixer, cl-semver, cl-sendgrid, cl-setlocale, cl-skkserv, cl-steamworks, cl-str, cl-tcod, cl-telegram-bot, cl-unicode, cl-utils, cl-wavelets, cl-webkit, cl-yaml, clesh, clj, clml, closer-mop, clsql, clweb, colored, common-lisp-jupyter, concrete-syntax-tree, conduit-packages, consix, corona, croatoan, curry-compose-reader-macros, dartscltools, dartscluuid, data-lens, defclass-std, deploy, dexador, djula, docparser, doplus, easy-audio, easy-routes, eazy-documentation, eclector, esrap, file-select, flexichain, float-features, floating-point-contractions, functional-trees, gadgets, gendl, generic-cl, glacier, golden-utils, gtirb-capstone, harmony, helambdap, house, hunchentoot-multi-acceptor, hyperluminal-mem, imago, ironclad, jingoh, jpeg-turbo, jsonrpc, kekule-clj, linear-programming, linux-packaging, lisp-chat, lisp-critic, lisp-gflags, literate-lisp, lmdb, local-package-aliases, local-time, lquery, markup, math, mcclim, millet, mito, mmap, mutility, named-readtables, neo4cl, nibbles, num-utils, origin, orizuru-orm, parachute, pathname-utils, perceptual-hashes, petalisp, phoe-toolbox, physical-quantities, picl, pjlink, portable-condition-system, postmodern, prometheus.cl, protest, protobuf, py4cl, py4cl2, qt-libs, quilc, quri, rcl, read-number, reader, rpcq, rutils, s-graphviz, sc-extensions, secret-values, sel, select, 
serapeum, shadow, simple-parallel-tasks, slime, sly, snooze, static-dispatch, stmx, stumpwm, swank-client, swank-protocol, sxql, tesseract-capi, textery, tooter, trace-db, trivial-compress, trivial-do, trivial-pooled-database, trivial-string-template, uax-15, uncursed, verbose, vp-trees, weblocks-examples, weblocks-prototype-js.
Removed projects: cl-arrows, cl-generic-arithmetic, clcs-code, dyna, osmpbf, sanity-clause, unicly.
To get this update, use (ql:update-dist "quicklisp")
Enjoy!
via Planet Lisp by on Sun, 20 Dec 2020 18:18:11 GMT
Quicklisp statistics are now available as CSV files, and the Quicklisp Stats system that I've just submitted to Quicklisp is a little helper library for handling this dataset and accessing it from inside Lisp.
Examples:
;;; How many times was Alexandria downloaded in Nov 2020?
QUICKLISP-STATS> (system-downloads :alexandria 2020 11)
13731
;;; Get all systems that were downloaded
;;; more than 10000 times in Apr 2020
;;; and print them somewhat nicely
QUICKLISP-STATS> (loop with stats = (month 2020 4)
                       with filtered-stats
                         = (remove-if-not (lambda (x) (< 10000 (cdr x)))
                                          stats)
                       for (system . count) in filtered-stats
                       do (format t ";; ~20A : ~5D~%" system count))
;; alexandria : 19938
;; cl-ppcre : 15636
;; bordeaux-threads : 14974
;; trivial-features : 14569
;; split-sequence : 14510
;; closer-mop : 14482
;; trivial-gray-streams : 14259
;; babel : 14254
;; cffi : 12365
;; flexi-streams : 11940
;; iterate : 11924
;; named-readtables : 11205
;; cl-fad : 10996
;; usocket : 10859
;; anaphora : 10783
;; trivial-backtrace : 10693
NIL
;;; How many downloads did Bordeaux Threads
;;; have over all of 2020?
QUICKLISP-STATS> (loop for ((year month) . data) in (all)
                       for result = (a:assoc-value data "bordeaux-threads"
                                                   :test #'equal)
                       do (format t ";; ~4,'0D-~2,'0D: ~D~%" year month result))
;; 2020-01: 16059
;; 2020-02: 12701
;; 2020-03: 17123
;; 2020-04: 14974
;; 2020-05: 14489
;; 2020-06: 13851
;; 2020-07: 14130
;; 2020-08: 10843
;; 2020-09: 13757
;; 2020-10: 13444
;; 2020-11: 15825
NIL
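Combining the two idioms above, one can also total a system's downloads for the whole year. This is only a sketch: it assumes the same all accessor and the a: (alexandria) nickname used in the examples above.

```lisp
;;; Sum bordeaux-threads downloads over all of 2020
;;; (sketch; assumes the ALL and A:ASSOC-VALUE calls shown above)
QUICKLISP-STATS> (loop for ((year month) . data) in (all)
                       when (= year 2020)
                         sum (or (a:assoc-value data "bordeaux-threads"
                                                :test #'equal)
                                 0))
```

Given the monthly figures printed above, this would total 157196.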
via Planet Lisp by on Sat, 19 Dec 2020 21:59:54 GMT
This is yet another documentation generator for Common Lisp, built by Masataro Asai.
Its unique feature is the documentation processor, which is able to extract docstrings from nonstandard Lisp forms. It also supports all markups supported by Pandoc, and it can be used to generate documentation from any folder.
You'll find more pros and cons in the template repository I've prepared for you.
Despite many cool features, these showstoppers keep me from using Eazy Documentation for my own projects:
MGL-PAX, reviewed recently, is still my favourite.
But Eazy Documentation can still be useful when:
via Planet Lisp by on Sat, 19 Dec 2020 00:00:00 GMT
A few remarks about the manardb.
via Planet Lisp by on Sat, 12 Dec 2020 15:03:00 GMT
Common Lisp programmers may write many with-something macros over their careers; the language specification itself is rife with such constructs: witness with-open-file. Many other libraries also introduce a slew of with- macros dealing with this or that case.
So, if this is the case, what prevents Common Lisp programmers from coming up with a generalized with macro?
It appears that the question has been answered, rather satisfactorily, in Python and Julia (at least). Python offers the with statement, alongside a library of "contexts" (Python introduced the with statement in 2005 with PEP 343) and Julia offers its do blocks.
In the following I will present WITH-CONTEXTS, a Common Lisp answer to the question. The library is patterned after the ideas embodied in the Python solution, but with several (common) "lispy" twists.
Here is the standard - underwhelming - example:
(with f = (open "foo.bar") do (do-something-with f))
That's it as far as syntax is concerned (the 'var =' part being optional, though obviously not in this example; the syntax was chosen to be loop-like, instead of using Python's as keyword). Things become more interesting when you look under the hood.
Traditional Common Lisp with- macros expand into variations of unwind-protect or handler-case (and friends). The example above, if written with with-open-file, would probably expand into something like the following:
(let ((f nil))
  (unwind-protect
      (progn
        (setq f (open "foo.bar"))
        (do-something-with f))
    (when f (close f))))
Python generalizes this scheme by introducing an enter/exit protocol that is invoked by the with statement. Please refer to the Python documentation on contexts and their __enter__ and __exit__ methods.
In order to introduce a with macro in Common Lisp that mimics what Python programmers expect and what Common Lisp programmers are used to, some twists are necessary. To achieve this goal, a protocol of three generic functions is provided alongside a library of contexts.
The WITH-CONTEXTS library provides three generic functions that are called at different times within the code resulting from the expansion of the invocation of the with macro.
Given the protocol (from now on referred to as the "EHE-C protocol"), the (underwhelming) "open file" example expands into the following:
(let ((f nil))
  (unwind-protect
      (progn
        (setq f (enter (open "contexts.lisp")))
        (handler-case (open-stream-p f)
          (error (#:ctcx-err-e-41883)
            (handle f #:ctcx-err-e-41883))))
    (exit f)))
Apart from the gensymmed variable the expansion is pretty straightforward. The function enter is called on the newly opened stream (and is essentially an identity function) and sets the variable. If some error happens while the body of the macro is executing then control is passed to the handle function (which, in its most basic form, just re-signals the condition). Finally, the unwind-protect has a chance to clean up by calling exit (which, when passed an open stream, just closes it).
One unexpected behavior for Common Lisp programmers is that the variable (f in the case above) escapes the with construct. This is in line with what Python does, and it may have its uses. The file opening example thus has the following behavior:
CL-prompt > (with f = (open "contexts.lisp") do
(open-stream-p f))
T
CL-prompt > (open-stream-p f)
NIL
To ensure that this behavior is reflected in the implementation, the actual macroexpansion of the with call becomes the following.
(let ((#:ctxt-esc-val-41882 nil))
  (multiple-value-prog1
      (let ((f nil))
        (unwind-protect
            (progn
              (setq f (enter (open "contexts.lisp")))
              (handler-case (open-stream-p f)
                (error (#:ctcx-err-e-41883)
                  (handle f #:ctcx-err-e-41883))))
          (multiple-value-prog1
              (exit f)
            (setf #:ctxt-esc-val-41882 f))))
    (setf f #:ctxt-esc-val-41882)))
This "feature" will help in - possibly - porting some Python code to Common Lisp.
Python attaches to the with statement the notion of contexts. In Common Lisp, as far as the with macro is concerned, anything that is passed as the expression to it must respect the enter/handle/exit protocol. The three generic functions enter, handle, and exit have simple defaults that essentially let everything "pass through", but specialized context classes have been defined that parallel the Python context library classes.
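To illustrate, here is a hedged sketch of what specializing the protocol for a user-defined context might look like. The timer-context class and its slot are invented for this example; only the enter/handle/exit generic function names come from the protocol just described.

```lisp
;;; Hypothetical context that times the execution of the WITH body.
;;; Class and slot names are invented for illustration.
(defclass timer-context ()
  ((start-time :accessor start-time :initform nil)))

(defmethod enter ((ctx timer-context))
  ;; Called when control enters the WITH body.
  (setf (start-time ctx) (get-internal-real-time))
  ctx)

(defmethod handle ((ctx timer-context) condition)
  ;; Like the default, just re-signal the condition.
  (error condition))

(defmethod exit ((ctx timer-context))
  ;; Called from the UNWIND-PROTECT cleanup form.
  (format t "Body took ~D internal time units.~%"
          (- (get-internal-real-time) (start-time ctx))))
```

With such methods in place, something like (with ctx = (make-instance 'timer-context) do ...) would print the elapsed time even if the body signals.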
First of all, the current library defines the EHE-C protocol for streams. This is the straightforward way to obtain the desired behavior for opening and closing files as with with-open-file.
Next, the library defines the following "contexts" (as Python does).
This should be a good enough base to start working with contexts in Common Lisp. It is unclear whether the Python decorator interface would provide some extra functionality in this Common Lisp implementation of contexts and the with macro.
The current implementation has a semantics that is obviously not the same as the corresponding Python one, but it is hoped that it still provides useful functionality. There are some obvious limitations that should be taken into account.
The current implementation of the library does not take into consideration threading issues. It could, by providing a locking-context based on a portable multiprocessing API (e.g., bordeaux-threads).
The Python implementation of contexts relies heavily on the yield statement. Again, the current implementation does not provide similar functionality, although it could possibly be implemented using a delimited continuation library (e.g., cl-cont).
The code associated to these documents is not completely tested and it is bound to contain errors and omissions. This documentation may contain errors and omissions as well. Moreover, some design choices are recognized as sub-optimal and may change in the future.
The file COPYING that accompanies the library contains a Berkeley-style license. You are advised to use the code at your own risk. No warranty whatsoever is provided, the author will not be held responsible for any effect generated by your use of the library, and you can put here the scariest extra disclaimer you can think of.
The with-contexts library is available on Quicklisp (not yet).
The with-contexts library is hosted at common-lisp.net.
The git repository can be gotten from the common-lisp.net Gitlab instance in the with-macro project page.
(cheers)
via Planet Lisp by on Fri, 11 Dec 2020 21:33:23 GMT
In this tutorial I explain how to start using classes in Common Lisp. It is mostly focused on learning about slots (properties): how to use them, what options are available on slots, and how to initialise a class.
Common Lisp Tutorial 10b: Basic Classes
A simple class can be created with the defclass macro:
(defclass person ()
(name age))
It can be initialised with the following code. Please be aware, however, that one does not use new or some factory-pattern named function to build an instance; Common Lisp has a different way, make-instance:
(make-instance 'person)
It is possible to get started with code this simple, using the slot-value function with setf to get/set the values stored in the slots:
(defclass person ()
(name age))
(let ((p (make-instance 'person)))
(setf (slot-value p 'name) 'bob)
(setf (slot-value p 'age) 24)
(format nil "~A: ~A" (slot-value p 'name) (slot-value p 'age)))
Alternatively one can also use with-slots to achieve the same result; the slot names are setf-able and can be read and written to easily!
(defclass person ()
(name age))
(let ((p (make-instance 'person)))
(with-slots (name age) p
(setf name 'bob)
(setf age 28)
(format nil "~A: ~A" name age)))
There's a lot more one can do with classes though; in fact there are 8 options that can be passed to a slot, each of which extends the behavior in useful ways. They are listed below:
A previous version of this article incorrectly claimed there was no way to get/set the slots.
The initarg option is used to set the value of slots at class initialisation; you do not have to use the same keyword as the slot name!
(defclass person ()
((name :initarg :name)))
; When you create an object, you can set the slot value like so
(let ((p (make-instance 'person :name "Fred")))
(with-slots (name) p
(format t "~A~%" name)))
The initform option is used to set the default value of slots at class initialisation, if no value is given.
(defclass person ()
((name :initform "Fred")))
; When you create an object without supplying a value, the slot gets its default
(let ((p (make-instance 'person)))
(with-slots (name) p
(format t "~A~%" name)))
The reader option allows you to have a function created for you to access the value stored in a slot. It is worth noting you can have as many :reader options as you like!
(defclass person ()
((name :initarg :name :reader name)))
; You can then use the function like so
(let ((p (make-instance 'person)))
(format t "~A~%" (name p)))
The writer option allows you to have a function created for you to change the value stored in a slot. It is worth noting you can have as many :writer options as you like!
(defclass person ()
((name :initarg :name :reader name :writer set-name)))
; You can then use the function like so
(let ((p (make-instance 'person)))
(set-name "Fred" p)
(format t "~A~%" (name p)))
The accessor option gives you a setf-able function that can be used to both read and write to the slot of a class instance.
(defclass person ()
((name :initarg :name :accessor name)))
(let ((p (make-instance 'person)))
(setf (name p) "Fred")
(format t "~A~%" (name p)))
The allocation option determines whether a slot exists on the class directly, and is therefore shared amongst all instances, or whether the slot is unique to each instance. The two options for allocation are :class and :instance. By default slots are allocated to :instance, not :class.
(defclass person ()
((name :initarg :name :allocation :instance :accessor name)
(species :initform "human" :allocation :class :accessor species)))
(let ((p (make-instance 'person :name "Fred"))
(p1 (make-instance 'person :name "Bob")))
(setf (species p1) "not human")
(format t "~A: ~A~%" (name p) (species p)))
The documentation option is to assist the programmer in understanding the purpose of a slot. Forgive such a trivial example below, as what a name slot on a person object is going to be is pretty self-evident, but in other cases maybe not so much.
(defclass person ()
((name :documentation "The persons name")))
The type option is another hint to programmers. It is important to note that despite appearances it is not an enforced type; it confused me at first, but it's just a hint, alongside :documentation.
zellerin has very kindly corrected this particular section, thank you!
To quote the HyperSpec
The :type slot option specifies that the contents of the slot will always be of the specified data type. It effectively declares the result type of the reader generic function when applied to an object of this class. The consequences of attempting to store in a slot a value that does not satisfy the type of the slot are undefined. The :type slot option is further discussed in Section 7.5.3 (Inheritance of Slots and Slot Options).
So be warned: this is not a mere hint to programmers, it is a promise to the compiler, and if you break that promise, anything can happen.
It is possible to see how enforcing the use of types throws a type error, using locally safety-optimized code like so:
(locally (declare (optimize (safety 3)))
(defclass foo () ((a :initarg :a :type integer)))
(make-instance 'foo :a 'a))
(defclass person ()
((name :type string)))
The code from the video is listed here for your convenience.
(defclass person ()
((name :initarg :name :initform "Bob" :accessor name :allocation :instance :type string :documentation "Stores a persons name")
(age :initarg :age :initform 18 :accessor age :allocation :instance :type integer :documentation "Stores a persons age")
(species :initarg :species :initform "human" :accessor species :allocation :class)))
(let ((p1 (make-instance 'person :name 145)))
(setf (species p1) "not-human")
(let ((p2 (make-instance 'person :name "Fred" :age 34)))
(format nil "~A: ~A (~A)" (name p2) (age p2) (species p2))))
via Planet Lisp by on Mon, 07 Dec 2020 16:06:01 GMT
Well, I am already slowly starting to get back into coding me some Lisp games. There just isn't much else to do in my free time in this current global health crisis.
For the last week, I have been mostly scribbling notes on my reMarkable about ways to fix the engine troubles discussed in the last couple of articles. I have a few solutions that look really good on paper, so I'm just starting to explore them in code.
While the problem itself isn't that difficult to solve, the difficulty is in retrofitting the existing engine -- that would be far too much work, both due to its size and complexity, and because of the code quality in general, with zero unit or integration tests.
For that reason, I am going to begin working on a new engine that will share a lot of ideas with the previous one, but will in fact be rewritten from the ground up, with a better architecture and proper tests every step of the way. I'm not going to say much about the new design or what's different until I am confident enough in it, but what I worked out is a way to use structure-objects and arrays in the performance-sensitive areas that were previously using standard-objects and hash tables.
As the project progresses into more than just an idea, I will publish the code on my GitHub as usual. I just wanted to mention that I'm happy to be back, although I am taking precautions as to not get so burnt out again.
via Planet Lisp by on Sun, 06 Dec 2020 19:10:00 GMT
At the beginning of the pandemic I stumbled upon an article regarding the problems that the State of New Jersey was having in issuing relief checks and funding due to the lack of ... COBOL programmers. At the time I followed a couple of links, landing on this "Hello World on z/OS" blog post. I was curious and obviously looking for something other than my usual day job, plus, I swear, I had never written some COBOL code.
What follows is a report of the things I learned and how I solved them. If you are easily bored, just jump to the end of this (long) post to check out the IRON MAIN Emacs Lisp package.
Well, to make a long story short, I eventually installed the Hercules emulator (and other ones - more on this maybe later) in its SDL/Hyperion incarnation and installed MVS on it; the versions I installed are TK4- and a "Jay Moseley" build (special thanks to Jay, who is one of the most gracious and patient people I interacted with over the Internet). I also installed other "big iron" OSes, e.g., MTS, on the various emulators and experimented a bit (again, maybe I will report on this later).
It has been a lot of fun, and I discovered a very lively (if grizzled) community of enthusiasts, which mostly gathers around a few groups.io groups, e.g., H390-MVS. The community is very helpful and, at this point, very similar, IMHO, to the "Lisp" communities out there, if you get my drift.
One way to interact with "the mainframe" (i.e., MVS running on Hercules) is to write your JCL in your host system (Linux, Windows, Mac OS) and then to submit it to a simulated card reader listening over a socket (port 3505, which is meaningful to the IBM mainframe crowd). JCL code is interesting, as is the overall forma mentis that is required to interact with the mainframe, especially for somebody who was initially taught UNIX, saw some VMS and a few hours of Univac Exec 8. In any case, you can write your JCL, where you can embed whole Assembler, COBOL, Fortran, PL/I etc code, using some editor on Windows, Linux or Mac OS etc.
Of course, Lisp guys know that there is one Editor, with its church. So, what one does is to list-all-packages and install jcl-mo... Wait...
To the best of my knowledge, as of December 2020, there is no jcl-mode to edit JCL code in Emacs.
It immediately became a categorical imperative to build one, which I did, while learning a bit of Emacs Lisp, that is, all the intricacies of writing modes and eventually posting them on MELPA.
Writing a major mode for Emacs in 2020 is simple in principle, but tricky in practice, especially, if, like me, you start with only a basic knowledge of the system as a user.
One starts with define-derived-mode and, in theory, things should be relatively easy from there on. The first thing you want to do is to get your font-lock-mode specifications right. Next you want to add some other nice visual tools to your mode. Finally you want to package your code to play nice with the Emacs ecosystem.
Font Lock mode (a minor mode) does have some quirks that make it a bit difficult to understand without in-depth reading of the manual and of the (sparse) examples one finds over the Internet. Of course, one never does enough RTFM, but I believe a few key points should be reported here.
Font Lock mode does two "fontification" operations/passes. At least this seems to be the way to interpret them.
To interact with Font Lock, a mode must eventually set the variable font-lock-defaults. The specification of the object contained in this variable is complicated. This variable is eventually a list with at least one element (the "keywords"); the optional second one controls whether the syntax table pass (2) is performed or not. I found that the interaction between the first two elements must be carefully planned. Essentially you must decide whether you want only the search based ("keyword") fontification or the syntax table based (2) fontification too.
If you do not want the syntax table based (2) fontification then you want to have the second element of font-lock-defaults set to non-NIL.
The first element of font-lock-defaults is where most of the action is. Eventually it becomes the value of the variable font-lock-keywords that Font Lock uses to perform search based fontification (1). The full range of values that font-lock-keywords may assume is quite rich; eventually its structure is just a list of "fontificators". There are two things to note however, which I found very useful.
First, Font Lock applies each element of font-lock-keywords (i.e., (first font-lock-defaults)) in order. This means that a certain chunk of text may be fontified more than once. Which brings us to the second bit of useful information.
Each element that eventually ends up in font-lock-keywords may have the form
(matcher . subexp-highlighter)
where subexp-highlighter = (subexp facespec [override [laxmatch]])
(see the full documentation for more details).
Fontification is not applied to chunks of text that have already been fontified, unless override is set to non-NIL. In this case the current fontification is applied. This is very important for things like strings and comments, which may interact in unexpected ways, unless you are careful with the order of font-lock-keywords.
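As a concrete illustration of the points above, a toy mode setup might look like the following. The regexps, faces, and keyword list here are invented for this sketch and are not taken from jcl-mode itself:

```elisp
;; Toy font-lock setup; entries in the keyword list are applied in
;; order, and already-fontified text is left alone unless OVERRIDE
;; is non-nil in a subexp-highlighter.
(defvar my-toy-jcl-font-lock-keywords
  '(("^//\\*.*$" . font-lock-comment-face)                ; comment cards
    ("^//\\([A-Z0-9]+\\)" 1 font-lock-function-name-face) ; job/step names
    ("\\_<\\(JOB\\|EXEC\\|DD\\)\\_>" . font-lock-keyword-face)))

(define-derived-mode my-toy-jcl-mode fundamental-mode "toy-JCL"
  "Toy mode illustrating font-lock-defaults."
  ;; Second element t: perform only the search-based (keyword) pass,
  ;; skipping the syntax-table-based fontification.
  (setq font-lock-defaults '(my-toy-jcl-font-lock-keywords t)))
```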
I suggest you download and use the wonderful library font-lock-studio by Anders Lindgren to debug your Font Lock specifications.
When you write lines, pardon, cards for MVS or z/OS it is nice to have a ruler to count on that tells you at what column you are (and remember that once you hit column 72 you'd better... continue). Emacs has a built in nice little utility that does just that: a minor mode named ruler-mode, which shows a ruler in the top row of your buffer.
There is a snag.
Emacs counts columns from 0. MVS, z/OS and friends count columns from 1. Popping up the ruler of ruler-mode in a buffer containing JCL (or COBOL, or Fortran) shows that you are "one off": not nice.
Almost luckily, in Emacs 27.x (which is what I am using) you can control this behavior using the variable column-number-indicator-zero-based, which is available when you turn on the minor mode column-number-mode. Its default is t, but if you set it to nil, the columns in the buffer will start at 1, which is "mainframe friendly". Alas, this change does not percolate (yet - it needs to be fixed in Emacs) to ruler-mode, which insists on counting from 0.
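For reference, the mode-line side of this can be arranged with two forms; this affects only the column indicator, not ruler-mode:

```elisp
;; Count columns from 1 in the mode line (Emacs 26+), matching
;; mainframe conventions; ruler-mode still needs its own fix.
(column-number-mode 1)
(setq column-number-indicator-zero-based nil)
```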
End of story: some - very minor - hacking was needed to fix the rather long "ruler function" to convince it to count columns from 1.
Is there a good way to do this?
It appears that most Emacs "packages" are one-file affairs. The package I wrote needs to be split up into a few files, but it is unclear (remember that I never do enough RTFM) how to keep things together for distribution, e.g., on MELPA or, more simply, in your Emacs load-path.
What I would like to achieve is to just do a load (or a load-library) of a single file that causes the loading of the other bits and pieces. It appears that Emacs Lisp does not have an ASDF or a MK:DEFSYSTEM as you have in Common Lisp (I will be glad to be proven wrong), so, as my package is rather small after all, I resorted to writing a main file that is named after the library and which can thus be referenced in the -pkg.el file that Emacs packaging requires. I could have used use-package, but its intent appears to be dealing with packages that are already "installed" in your Emacs environment.
MELPA comes with its recipes format to register your package; it is a description of your folder structure and it is useful, but it is something you need to submit separately to the main site and, let me add, in a rather cumbersome way. Quicklisp is far friendlier.
One other rant I have with the Emacs package distribution sites (e.g., MELPA and El-Get) is that eventually they assume you are on UN*X (Linux) and require you to have installed bits and pieces of the traditional UN*X toolchain (read: make) or worse. I am running on W10 these days and there must be a better way.
Bottom line: I created a top file (iron-main.el) which just sets up a few things and requires and/or loads the other files that are part of or needed by the package. One of the files contains the definition of a minor mode called iron-main-mode (in an eponymous .el file).
I am wondering whether this is the best way of doing things in Emacs Lisp. Please tell me in the comments section.
At the end of the story, here is the link to the GitHub repository for the IRON MAIN Emacs package to interact with the mainframe.
As you see the package is rather simple.
It is essentially three files plus the "main" one and a few ancillary ones.
One of the nice things I was able to include in jcl-mode is the ability to submit the buffer content (or another .jcl file, pardon, dataset) to the mainframe card reader listening on port 3505 (by default, assuming such a card reader has been configured).
This turns out to be useful, because it allows you to avoid using netcat, nc.exe or nc64.exe, which, at least on W10, always trigger Windows Defender. Plus everything remains integrated with Emacs. Remember: there's an Emacs command for that!
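The core of such a submission command can be sketched in a few lines of Emacs Lisp; the function name and defaults below are invented for illustration and are not the actual jcl-mode implementation:

```elisp
;; Hypothetical sketch: send the current buffer to a simulated card
;; reader listening on a socket (Hercules' 3505 by default).
(defun my-submit-jcl (&optional host port)
  "Submit the current buffer to the card reader at HOST:PORT."
  (interactive)
  (let ((proc (open-network-stream "jcl-submit" nil
                                   (or host "localhost")
                                   (or port 3505))))
    (unwind-protect
        (process-send-string proc (buffer-string))
      (delete-process proc))))
```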
To conclude, here are two screenshots (one "light", one "dark") of a test JCL included in the release. Submitting it from Emacs to TK4- and to a "Jay Moseley's build" seems to work pretty well. Just select the Submit menu under JCL OS or invoke the submit function via M-x.
What's next? A few things apart from cleaning up, like exploring polymode; after all, embedding code in JCL is not unheard of.
That's it. It has been fun and I literally learned a lot of new things. Possibly useful.
If you are a mainframe person, do jump on the Emacs bandwagon. Hey, you may want to write a ISPF editor emulator for it 😏😄
(cheers)
MA
via Zach Beane Common Lisp by on Thu, 26 Nov 2020 17:45:55 GMT