diff --git a/OS/TASKS.md b/OS/TASKS.md
deleted file mode 100644
index b4a124f..0000000
--- a/OS/TASKS.md
+++ /dev/null
@@ -1,9 +0,0 @@
-- [ ] Decide: better queue?
-
-So, Michael suggested we simply watch and filter the `dmesg` stream (I think that's what it's called?), so I suppose we could have a script like `/01/core/kernel_watch.py` that puts things into the queue? Honestly knowing we could get it all from one place like that— maybe this should be simpler. **Is the queue folder necessary?** How about we just expect the computer to send {"role": "computer"} messages to a POST endpoint at "/queue" or maybe "/inturrupt" or maybe "/" but with POST? When it gets those it puts them in the redis queue, which is checked frequently, so it's handled immediatly. So then yeah, maybe we do have redis there, then instead of looking at that folder, we check the redis queue. Is this better for any reason? Making the way computer messages are sent = an HTTP request, not putting a file in a folder?
-
-
-
-# For later
-- [ ] Then we could have `/i` which other interpreter's hit. That behaves more like the OpenAI POST endpoint with stream=True by default (i think this is important for users to see the exchange happening in real time, streaming `event/stream` or whatever). You could imagine some kind of handshake btw— another interpreter → my interpreter's /i → the sender is unrecognized → computer message is sent to / which tells the user to have the interpreter send a specific code → the user tells the sending interpreter to use that specific code → the sender is recognized and added to friends-list (`computer.inetwork.friends()`) → now they can hit eachother's i endpoints freely with `computer.inetwork.friend(id).message("hey")`.
-- [ ] When transfering skills that require OS control, the sender can replace those skills with that command, with one input "natural language query" (?) preceeded by the skill function name or something like that. Basically so if you ask it to do something you set up as a skill, it actually asks your computer to do it. If you ask your computer to do it directly, it's more direct.
diff --git a/TEAMS.md b/TEAMS.md
index 56b0fe9..3be61e7 100644
--- a/TEAMS.md
+++ b/TEAMS.md
@@ -1,6 +1,7 @@
 # Teams
 
 ## Hardware
+
 - Ben @humanbee
 - Ty @tyisfly
 - Use Michael as a recruitor
@@ -10,6 +11,7 @@
 - ..?
 
 ## Software
+
 - Audio (TTS / SST)
   - Tasks: Streaming audio both ways.
   - Hardware limitations. What's the smallest hardware this can be on?
@@ -19,20 +21,20 @@
 - OI Core
   - Tasks: Computer API (schedule thing), skill library
   - Hristijan @thekeyq
-  - Aakash @ashgam._
+  - Aakash @ashgam.\_
   - Aniket @atneik
   - Shiven @shivenmian
   - Ty @tyisfly
   - Killian @killianlucas
-- Backend
-  - Killian @killianlucas
-- Linux / Firmware
+- OS (formerly "Linux / Firmware")
   - Tasks: Virtualization? ISO? Putting sensors around the OS to put files into the queue. Bootloader. Networked input into the queue
   - Shiven @shivenmian
+  - Hristijan @thekeyq
   - Michael @mjjt
   - Zohaib @Zabirauf
 
 ## Experience
+
 - Design
   - Arturo @arturot
   - Ronith @ronithk
@@ -57,6 +59,7 @@
   - Uli @ulidabess
 
 ## Comms
+
 - Uli @ulidabess
 - Discord Community
 - Twitter Presence
diff --git a/software/oi_core/TASKS.md b/software/oi_core/TASKS.md
new file mode 100644
index 0000000..2c116ed
--- /dev/null
+++ b/software/oi_core/TASKS.md
@@ -0,0 +1,3 @@
+- [ ] Release Open Interpreter `0.2.1`
+- [ ] Meet to determine Computer API additions for the 01
+- [ ] Meet to decide how to build the skill library + skill recording
diff --git a/software/oi_core/TEAM.md b/software/oi_core/TEAM.md
new file mode 100644
index 0000000..b5b6810
--- /dev/null
+++ b/software/oi_core/TEAM.md
@@ -0,0 +1,8 @@
+- Hristijan @thekeyq
+- Aakash @ashgam.\_
+- Aniket @atneik
+- Shiven @shivenmian
+- Ty @tyisfly
+- Killian @killianlucas
+
+Team lead: Killian
diff --git a/software/os/TASKS.md b/software/os/TASKS.md
new file mode 100644
index 0000000..27b5f06
--- /dev/null
+++ b/software/os/TASKS.md
@@ -0,0 +1,13 @@
+- [ ] Modify bootloader.
+- [ ] Decide: better queue?
+
+  So, Michael suggested we simply watch and filter the `dmesg` stream (I think that's what it's called?), so I suppose we could have a script like `/01/core/kernel_watch.py` that puts things into the queue? Honestly, knowing we could get it all from one place like that, maybe this should be simpler. **Is the queue folder necessary?** How about we just expect the computer to send `{"role": "computer"}` messages to a POST endpoint at "/queue", or maybe "/interrupt", or maybe "/" but with POST? When it gets one of those, it puts it in the Redis queue, which is checked frequently, so it's handled immediately. So then, yeah, maybe we do have Redis there, and instead of looking at that folder, we check the Redis queue. Is this better for any reason? Making the way computer messages are sent an HTTP request, rather than putting a file in a folder? (Both the endpoint and the watcher are sketched below.)
+- [ ] Virtualization?
+- [ ] Best workflow for pressing to an ISO? Cubic?
+- [ ] Putting sensors around the OS to put things into the queue / `dmesg` implementation.
+- [ ] Networked input into the queue? (Exploring this makes me think the "/queue"-style endpoint is smarter than the "queue" folder.)
+
+# For later
+
+- [ ] We could have `/i`, which other interpreters hit. It behaves more like the OpenAI POST endpoint, with `stream=True` by default (I think this is important for users to see the exchange happening in real time, streaming `text/event-stream` or whatever). You could imagine some kind of handshake: another interpreter → my interpreter's /i → the sender is unrecognized → a computer message is sent to /, prompting the AI to ask the user to have the sending interpreter send a specific code → the user tells the sending interpreter to use that specific code → the sender is recognized and added to the friends list (`computer.inetwork.friends()`) → now they can hit each other's /i endpoints freely with `computer.inetwork.friend(id).message("hey")`. (Rough sketch below.)
+- [ ] (OS team: this will require coordination with the OI Core team, so let's talk about it / I'll explain at the next meetup.) When transferring skills that require OS control, the sender can replace those skills with a forwarding command that takes one input, a "natural language query" (?), preceded by the skill function name or something like that. Basically, if you ask it to do something you set up as a skill, it actually asks your computer to do it; if you ask your computer to do it directly, it's more direct. (A possible stub is sketched below.)
diff --git a/software/os/TEAM.md b/software/os/TEAM.md
new file mode 100644
index 0000000..2a88b1d
--- /dev/null
+++ b/software/os/TEAM.md
@@ -0,0 +1,5 @@
+- Shiven @shivenmian
+- Hristijan @thekeyq
+- Killian @killianlucas
+- Michael @mjjt
+- Zohaib @Zabirauf
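
The "POST to `/`, push to Redis" idea in `software/os/TASKS.md` could look roughly like the sketch below. FastAPI, redis-py, the `01:computer_messages` key, and the `handle_message` callback are illustrative assumptions, not settled design; the notes don't commit to any particular framework.

```python
# Rough sketch of the "POST to /, push to Redis" idea from software/os/TASKS.md.
# FastAPI, redis-py, the queue key, and handle_message() are assumptions made
# for illustration only.

import json

import redis
from fastapi import FastAPI, Request

app = FastAPI()
r = redis.Redis(host="localhost", port=6379, db=0)

QUEUE_KEY = "01:computer_messages"  # hypothetical Redis list name


@app.post("/")
async def receive_computer_message(request: Request):
    # The computer (or kernel_watch.py, see the next sketch) POSTs messages
    # shaped like {"role": "computer", "content": "..."} to this endpoint.
    message = await request.json()
    r.lpush(QUEUE_KEY, json.dumps(message))
    return {"queued": True}


def consume_forever(handle_message):
    # Instead of scanning a queue folder, block on the Redis list and hand
    # each message to the interpreter loop as soon as it arrives.
    while True:
        _, raw = r.brpop(QUEUE_KEY)
        handle_message(json.loads(raw))
```

The practical difference from the queue folder is that `brpop` blocks until a message arrives, so nothing has to poll the filesystem and messages are handled as soon as they are pushed.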
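
The watcher side, the hypothetical `/01/core/kernel_watch.py`, could stay very small. The keyword filter and the endpoint URL below are placeholders, and `dmesg --follow` usually needs root or membership in the right group:

```python
# kernel_watch.py — sketch of watching and filtering the dmesg stream, then
# forwarding matches as {"role": "computer"} messages to the POST endpoint
# above. The endpoint URL and the keyword filter are placeholders.

import subprocess

import requests

ENDPOINT = "http://localhost:8000/"     # wherever the 01 server listens
KEYWORDS = ("usb", "battery", "error")  # hypothetical filter


def watch():
    # `dmesg --follow` streams new kernel messages as they are logged
    # (usually requires root or the adm/systemd-journal group).
    proc = subprocess.Popen(["dmesg", "--follow"], stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        if any(keyword in line.lower() for keyword in KEYWORDS):
            requests.post(
                ENDPOINT,
                json={"role": "computer", "content": line.strip()},
                timeout=5,
            )


if __name__ == "__main__":
    watch()
```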
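
For the later `/i` idea, a minimal sketch of an OpenAI-style endpoint that streams by default and only answers recognized senders might look like this. The `X-Sender-Id` header, the in-memory friends set, and the two placeholder functions are inventions for illustration; in the real system the friends list would presumably live behind `computer.inetwork.friends()` and the notification would go through the `/` queue endpoint.

```python
# Sketch of the /i endpoint: OpenAI-style POST, streaming by default as
# text/event-stream, refusing senders that aren't on the friends list.

import json

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()
FRIENDS = {"friend-123"}  # stand-in for computer.inetwork.friends()


def notify_user(text: str) -> None:
    # Placeholder: in the real flow this would post a {"role": "computer"}
    # message to "/" so the AI can ask the user for a pairing code.
    print(text)


def run_interpreter(messages):
    # Placeholder generator standing in for the local interpreter's reply.
    yield {"role": "assistant", "content": "hey"}


@app.post("/i")
async def interpreter_to_interpreter(request: Request):
    sender = request.headers.get("X-Sender-Id", "")
    if sender not in FRIENDS:
        notify_user(f"Unrecognized interpreter {sender!r} wants to connect.")
        return {"error": "unrecognized sender, handshake required"}

    body = await request.json()

    def stream():
        # Stream chunks as they are produced so the user can watch the
        # exchange in real time.
        for chunk in run_interpreter(body.get("messages", [])):
            yield f"data: {json.dumps(chunk)}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(stream(), media_type="text/event-stream")
```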
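
The skill-transfer item is vaguer, but the "replace the skill with a forwarding command" idea might reduce to a stub like this, where `send_to_computer` stands in for whatever transport ends up carrying the request back to the machine that actually has OS control:

```python
# Sketch only: replace a transferred OS-control skill with a stub that asks
# the owner's computer to perform it instead of running the code locally.

def make_forwarded_skill(skill_name, send_to_computer):
    def skill(natural_language_query: str):
        # One input: a natural-language query, preceded by the skill's name,
        # forwarded to the computer that can actually do it.
        return send_to_computer(f"{skill_name}: {natural_language_query}")

    return skill


# Hypothetical usage:
#   open_garage = make_forwarded_skill("open_garage", send_to_computer)
#   open_garage("open the garage door")  # sends "open_garage: open the garage door"
```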