v1.0.24
- fix prompt tree ui display (58db9ce)
- fix summary computation (8912f6a)
- remove filetypes (#88) (c037f33)
- clean out audits (01fa8e2)
- generate summary along with changes (cb36ad0)
- Request cache (#87) (7daa286)
- fix missing OpenAI token (d24c86f)
- updated gen-intro to not use output var (d256615)
- remove "output" parameter (#86) (c0d4056)
- files related to whitepaper sample (d0b4ebd)
- can't fool it. (535fb57)
- try to fool vscode (ffb76d7)
- add note in md (466c790)
- removed system.concise (1c0442e)
- little note about abort (4347cd4)
- renamed preview files (cb2b0f2)
- Updates to white paper demo (1b45394)
- Merge branch 'main' of https://github.com/microsoft/coarch (d483872)
- doc fix (028ffab)
- Using CoArch to write a CoArch white paper (065b139)
- Removing fragments mostly (#84) (ab5458c)
- faq skeleton (24f2f0c)
- move retry as first button (05d6c70)
- surface list of templates in prompt context (fc457c1)
- add flow to select next prompt (92880d5)
- display OpenAI token in status tooltip (f4a11d0)
- save if edit was applied (d49d50c)
- track edit state in request (b31fa1c)
- add application logic (1687341)
- add docs on llama token (b85bb59)
- support for llama model on TGI (78044e4)
- added code review to helloworld demo (c23c2c1)