LispLanguage on graphics card
Quora Answer : How can I use my favorite language on a GPU?
Short answer : you (or someone else) will have to write a compiler from PicoLisp (or a subset of it) to whatever actually runs on the GPU. That's probably some C variant that targets the CUDA platform.
The good news is that this isn't impossible. There ARE Lisp-to-C compilers. Lisps (like other FP languages) can provide a good model for parallel programming.
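To make the idea concrete, here is a toy sketch of what such a compiler's core could look like : it turns a tiny s-expression subset into a C expression and wraps it in a CUDA-style elementwise kernel. Everything here (the subset, the `compile_expr` / `compile_kernel` names, the kernel template) is invented for illustration; it is not PicoLisp's reader or any real Lisp-to-CUDA compiler.

```python
# Toy illustration: compiling a tiny Lisp-like subset to CUDA-flavoured C.
# The subset and kernel template are hypothetical, not any real compiler.

OPS = {"+": "+", "-": "-", "*": "*", "/": "/"}

def compile_expr(expr):
    """Translate a nested-tuple s-expression into a C expression string."""
    if isinstance(expr, tuple):
        op, *args = expr
        return "(" + f" {OPS[op]} ".join(compile_expr(a) for a in args) + ")"
    return str(expr)  # atoms (variables, literals) pass through unchanged

def compile_kernel(name, body):
    """Wrap a compiled expression in a CUDA-style elementwise kernel."""
    return (
        f"__global__ void {name}(float *out, const float *x, int n) {{\n"
        f"    int i = blockIdx.x * blockDim.x + threadIdx.x;\n"
        f"    if (i < n) out[i] = {compile_expr(body)};\n"
        f"}}\n"
    )

# (* (+ x[i] 1.0) 2.0)  becomes  ((x[i] + 1.0) * 2.0)
print(compile_kernel("scale", ("*", ("+", "x[i]", "1.0"), "2.0")))
```

A real compiler would of course need to handle bindings, conditionals, memory layout and kernel launch configuration, which is exactly where the hard parts of PicoLisp (closures, the cell-based heap) would start to bite.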
But you'll probably find that some aspects of PicoLisp would be hard to implement on a GPU. And that may include some of its most attractive features, such as the built-in database storage and the Prolog engine.
Whether anyone has done this yet, I have no idea.
Update : there seem to be Common Lisp CUDA bindings : takagi/cl-cuda