-
Adding LaTeX rendering support would be nice. I am not sure that we can bundle an entire Python interpreter with it, but generally we should try to make a full-featured UI.
-
The existence of Pyodide has come to my attention. It allows Python code to be executed inside the browser, client-side.
"Pyodide makes it possible to install and run Python packages in the browser with micropip. Any pure Python package with a wheel available on PyPI is supported. Many packages with C extensions have also been ported for use with Pyodide. These include many general-purpose packages such as regex, pyyaml, lxml and scientific Python packages including numpy, pandas, scipy, matplotlib, and scikit-learn."
So both the core interpreter and a decent set of common, non-core modules are available as WASM, and pure-Python packages can also be installed.
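For illustration, a rough sketch of what this could look like from the front-end side, assuming the Pyodide CDN script (e.g. `pyodide.js`) is already included on the page; `snowballstemmer` is just an arbitrary pure-Python wheel used as an example:

```js
// Sketch only: load Pyodide, install a pure-Python wheel with micropip,
// then run a small snippet entirely client-side.
async function runPythonInBrowser() {
  const pyodide = await loadPyodide();      // global provided by pyodide.js
  await pyodide.loadPackage("micropip");    // micropip ships with Pyodide
  await pyodide.runPythonAsync(`
import micropip
await micropip.install("snowballstemmer")  # any pure-Python wheel on PyPI
import snowballstemmer
print(snowballstemmer.stemmer("english").stemWord("running"))
`);
}
```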
Being able to execute Python directly in the browser permits students to quickly test code/syntax, and lowers the bar for teachers and students alike to get started. It also permits the LLM to illustrate mathematical concepts in a way that aids comprehension of otherwise textual information (using matplotlib, for example).
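Something like the following sketch could produce such an illustration entirely client-side. The `plot` image element id is made up here, and error handling is omitted:

```js
// Sketch only: render a matplotlib figure in the browser and drop the
// result into the page as a PNG data URL.
async function plotInBrowser() {
  const pyodide = await loadPyodide();
  await pyodide.loadPackage(["numpy", "matplotlib"]);  // prebuilt WASM wheels
  const pngBase64 = await pyodide.runPythonAsync(`
import base64, io
import numpy as np
import matplotlib
matplotlib.use("agg")      # headless backend; no display in the browser sandbox
import matplotlib.pyplot as plt

x = np.linspace(0, 2 * np.pi, 200)
plt.plot(x, np.sin(x))
plt.title("sin(x)")

buf = io.BytesIO()
plt.savefig(buf, format="png")
base64.b64encode(buf.getvalue()).decode("ascii")   # last expression is returned
`);
  // Hypothetical <img id="plot"> element somewhere in the UI.
  document.getElementById("plot").src = "data:image/png;base64," + pngBase64;
}
```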
Which leads to the other thing... LaTeX. It does not appear that the llama.cpp front-end handles LaTeX.
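If LaTeX support were wanted, one possible approach (not something llama.cpp does today) would be to render it client-side with a library such as KaTeX. A very rough sketch, assuming `katex.min.js`/`katex.min.css` are loaded and only handling `$$...$$` blocks:

```js
// Sketch only: replace $$...$$ spans in a chat message with KaTeX output.
function renderLatexInMessage(messageElement) {
  messageElement.innerHTML = messageElement.innerHTML.replace(
    /\$\$([\s\S]+?)\$\$/g,
    (_, tex) => katex.renderToString(tex, { displayMode: true, throwOnError: false })
  );
}
```

A real integration would presumably hook into the existing markdown rendering rather than rewriting `innerHTML` like this, but it shows the general idea.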
...which finally leads to the question in the subject line:
Is the llama.cpp-bundled front-end intended to remain somewhat simple, such that the above two items are outside its scope? That would be entirely understandable, of course. I am just curious whether any guidance has been given on this.
Thank you very, very much for llama.cpp.