- [ ] Create a system message that includes instructions on how to use the skill library.
- [ ] Test it end-to-end.
- [ ] Create computer.skills.teach(), which displays a message asking the user to complete the task via voice in the most generalizable way possible. OI should use computer.mouse and computer.keyboard to fulfill each step, then save the generalized instruction as a skill. Clicking the mouse cancels teach mode. When OI invokes this skill in the future, it will simply list those steps; OI then needs to figure out how to flexibly accomplish each one.
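A minimal sketch of how a taught skill could be persisted and replayed. The `Skill` class, `SKILLS_DIR`, and the JSON layout are assumptions for illustration, not OI's actual API:

```python
# Hypothetical sketch: store a taught skill as generalized steps,
# then replay it by listing those steps for the model to accomplish.
import json
from dataclasses import dataclass, field, asdict
from pathlib import Path

SKILLS_DIR = Path("skills")  # assumed location for the skill library


@dataclass
class Skill:
    name: str
    steps: list = field(default_factory=list)  # generalized instructions

    def save(self):
        SKILLS_DIR.mkdir(exist_ok=True)
        (SKILLS_DIR / f"{self.name}.json").write_text(json.dumps(asdict(self)))

    @classmethod
    def load(cls, name):
        data = json.loads((SKILLS_DIR / f"{name}.json").read_text())
        return cls(**data)

    def run(self):
        # On invocation the skill just lists its steps; the model decides
        # how to flexibly accomplish each one with computer.mouse/keyboard.
        return [f"Step {i + 1}: {s}" for i, s in enumerate(self.steps)]
```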
- [ ] Expose ^ via `interpreter --teach`.
- [ ] Add `interpreter --server --expose`.
- [ ] Include the 01's --server in the next OI update.
- [ ] Add --server --expose, which will expose the server via something like Ngrok and display the public URL plus a password so the 01 Light can connect to it. This will let people use OI on their computer via their Light: e.g. "Check my emails" would run AppleScript on their home computer.
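A hedged sketch of what `--server --expose` could do. The `expose` helper and the password scheme are assumptions; the real flag might use pyngrok or a different tunnel entirely, and reading the public URL from ngrok's local API is omitted here:

```python
# Sketch of --server --expose: open a tunnel and print credentials.
# Assumes ngrok is installed and authenticated on the host machine.
import secrets
import subprocess


def generate_password(n_bytes=16):
    # URL-safe token the 01 Light must present to connect.
    return secrets.token_urlsafe(n_bytes)


def expose(port=8000):
    password = generate_password()
    # Launch the tunnel; the public URL would be read from ngrok's
    # local API (http://127.0.0.1:4040/api/tunnels), omitted here.
    subprocess.Popen(["ngrok", "http", str(port)])
    print(f"Server on port {port} exposed. Password: {password}")
    return password
```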
- [ ] Why is OI starting so slowly? We could wrap suspect sections of startup with time.time() to track it down.
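One way to do that timing, sketched with a small context manager (the `timed` helper is hypothetical; it uses `time.perf_counter()`, which is monotonic and better suited than `time.time()` for measuring durations):

```python
# Track down slow startup by timing each phase.
import time
from contextlib import contextmanager


@contextmanager
def timed(label):
    start = time.perf_counter()
    yield
    print(f"{label}: {time.perf_counter() - start:.3f}s")


# Usage: wrap suspect sections of OI's startup, e.g. heavy imports.
with timed("import json (stand-in for a heavy import)"):
    import json
```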
- [ ] Create moondream-powered computer.camera.
- [ ] computer.camera.view(query) should take a picture and ask moondream the query, which defaults to "Describe this image in detail."
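A sketch of how `view` could be wired, assuming nothing about moondream's real interface: the capture function and the model are injected as callables (in practice these would be an OpenCV frame grab and a moondream query), so only the default-query behavior is shown here.

```python
# Hypothetical Camera class: capture a picture, ask a vision model about it.
class Camera:
    def __init__(self, capture, model):
        self.capture = capture  # () -> image, e.g. a webcam frame
        self.model = model      # (image, query) -> str answer

    def view(self, query="Describe this image in detail."):
        image = self.capture()           # take a picture
        return self.model(image, query)  # ask the model the query
```

Injecting the model keeps the camera testable without camera hardware or model weights.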