diff --git a/CONTEXT.md b/CONTEXT.md index 7b53070..7fa9640 100644 --- a/CONTEXT.md +++ b/CONTEXT.md @@ -1,4 +1,4 @@ -# Takeaways +# Context 1. **Be minimal.** @@ -6,7 +6,7 @@ Both to developers (the 01 should be very programmer friendly) and to the end us 2. **Develop standards.** -That should be compatible with other popular systems. For example, I think [LMC messages](https://docs.openinterpreter.com/protocols/lmc-messages) should ~ work on OpenAI's API and vice versa. +That should be compatible with other popular systems. 3. **Resonate strongly with a niche.** diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 61c1171..40500e8 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,21 +1,13 @@ # ● -**01 is the world's first open-source Language Model Computer (LMC). 01OS is the operating system that powers it** - There are many ways to contribute, from helping others on [Github](https://github.com/KillianLucas/01/issues) or [Discord](https://discord.gg/Hvz9Axh84z), writing documentation, or improving code. -We depend on contributors like you. Let's build this. - ## What should I work on? -Please pick up a task from our [roadmap](https://github.com/KillianLucas/01/blob/main/ROADMAP.md) or work on solving an [issue](https://github.com/KillianLucas/01/issues). +Please pick up a task from [issues](https://github.com/KillianLucas/01/issues). If you encounter a bug or have a feature in mind, [search if an issue already exists](https://docs.github.com/en/github/searching-for-information-on-github/searching-on-github/searching-issues-and-pull-requests#search-by-the-title-body-or-comments). If a related issue doesn't exist, please [open a new issue](https://github.com/KillianLucas/01/issues/new/choose). -## Philosophy - -01OS embodies a philosophy of breaking free from technological limitations and knowledge gaps by leveraging AI for intuitive, natural language interactions, democratizing access to compute through open-source flexibility and transforming devices into responsive, human-centric computing tools. - # Contribution Guidelines 1. Before taking on significant code changes, please discuss your ideas on [Discord](https://discord.gg/Hvz9Axh84z) to ensure they align with our vision. We want to keep the codebase simple and unintimidating for new users. @@ -31,31 +23,13 @@ We will review PRs when possible and work with you to integrate your contributio Once you've forked the code and created a new branch for your work, you can run the fork by following these steps: -1. CD into the project folder `/01OS` +1. CD into the software folder `/software` 2. Install dependencies `poetry install` 3. Run the program `poetry run 01` -**Note**: This project uses [`black`](https://black.readthedocs.io/en/stable/index.html) and [`isort`](https://pypi.org/project/isort/) via a [`pre-commit`](https://pre-commit.com/) hook to ensure consistent code style. If you need to bypass it for some reason, you can `git commit` with the `--no-verify` flag. - -### Installing New Dependencies - -If you wish to install new dependencies into the project, please use `poetry add package-name`. - -### Installing Developer Dependencies - -If you need to install dependencies specific to development, like testing tools, formatting tools, etc. please use `poetry add package-name --group dev`. - -### Known Issues - -For some, `poetry install` might hang on some dependencies. 
As a first step, try to run the following command in your terminal: - -`export PYTHON_KEYRING_BACKEND=keyring.backends.fail.Keyring` - -Then run `poetry install` again. If this doesn't work, please join our [Discord community](https://discord.gg/Hvz9Axh84z) for help. - ## Code Formatting and Linting -Our project uses `black` for code formatting and `isort` for import sorting. To ensure consistency across contributions, please adhere to the following guidelines: +Our project uses [`black`](https://black.readthedocs.io/en/stable/index.html) for code formatting and [`isort`](https://pypi.org/project/isort/) for import sorting via a [`pre-commit`](https://pre-commit.com/) hook to ensure consistent code style across contributions. Please adhere to the following guidelines: 1. **Install Pre-commit Hooks**: @@ -63,7 +37,6 @@ Our project uses `black` for code formatting and `isort` for import sorting. To ```bash cd software # Change into `software` directory if not there already. - poetry shell # It's better to do it within the virtual environment of your project poetry add --dev pre-commit # Install pre-commit as a dev dependency pre-commit install ``` @@ -79,6 +52,22 @@ Our project uses `black` for code formatting and `isort` for import sorting. To isort . ``` +3. **Bypassing**: + + If you need to bypass this for some reason, you can `git commit` with the `--no-verify` flag. + +### Installing New Dependencies + +If you wish to install new dependencies into the project, please use `poetry add package-name`. + +### Known Issues + +For some, `poetry install` might hang on some dependencies. As a first step, try to run the following command in your terminal: + +`export PYTHON_KEYRING_BACKEND=keyring.backends.fail.Keyring` + +Then run `poetry install` again. If this doesn't work, please join our [Discord community](https://discord.gg/Hvz9Axh84z) for help. + # Licensing Contributions to 01 are under AGPL. diff --git a/GOALS.md b/GOALS.md deleted file mode 100644 index a3f58ba..0000000 --- a/GOALS.md +++ /dev/null @@ -1,21 +0,0 @@ -**The 01 Project** is comprised of the following goals, to be completed by _February 23rd, 2024_: - -
- -# 1. Create a blueprint -We will create a blueprint for a LMC (Language Model Computer) called the 01. - -
- -# 2. Publish a family of protocols -We will publish protocols to advance the LMC ecosystem. - -
- -# 3. Film a compelling video -This video will showcase the 01. - -
- -# 4. Build a physical device -Everyone on the core team will receive a functional device. diff --git a/INSPIRATION.md b/INSPIRATION.md deleted file mode 100644 index da1aa2a..0000000 --- a/INSPIRATION.md +++ /dev/null @@ -1,7 +0,0 @@ -| | | -|---|---| -| ![Image 13](https://github.com/KillianLucas/01/assets/63927363/7e7c179d-f0f7-4dd3-a3a0-6a750ba86f17) | ![Image 4](https://github.com/KillianLucas/01/assets/63927363/a920b172-179b-48ad-b21b-aa016955ee93) | -| ![Image 9](https://github.com/KillianLucas/01/assets/63927363/18c4a7d7-ce15-4597-ad90-28d0133321dd) | ![Image 8](https://github.com/KillianLucas/01/assets/63927363/d93bb4b0-dada-41c2-94aa-e156f40e4e00) | -| ![Image 7](https://github.com/KillianLucas/01/assets/63927363/cae5fa56-3016-4d5c-a2d9-2d1a0bb8ead7) | ![Image 6](https://github.com/KillianLucas/01/assets/63927363/7c502082-336b-436b-ab69-605878451592) | -| ![Image 5](https://github.com/KillianLucas/01/assets/63927363/bcaafacd-8af0-42a0-a3d5-91b1f1769311) | ![Image 10](https://github.com/KillianLucas/01/assets/63927363/9d1fc091-d19a-4b22-9866-90a0711e0f3d) | -| ![Image 3](https://github.com/KillianLucas/01/assets/63927363/51c0f95d-f8b7-4e2e-b4f4-f8beea219b88) | | diff --git a/README.md b/README.md index 378e9f7..59e0b36 100644 --- a/README.md +++ b/README.md @@ -4,7 +4,7 @@ Discord

- The open-source language model computer.
+ The #1 open-source voice interface.

Get Updates‎ ‎ |‎ ‎ Documentation

@@ -26,14 +26,20 @@ We want to help you build. [Apply for 1-on-1 support.](https://0ggfznkwh4j.typef > [!IMPORTANT] > This experimental project is under rapid development and lacks basic safeguards. Until a stable `1.0` release, only run this repository on devices without sensitive information or access to paid services. -> -> **A substantial rewrite to address these concerns and more, including the addition of [RealtimeTTS](https://github.com/KoljaB/RealtimeTTS) and [RealtimeSTT](https://github.com/KoljaB/RealtimeSTT), is occurring [here](https://github.com/KillianLucas/01-rewrite/tree/main).**
-**The 01 Project** is building an open-source ecosystem for AI devices. +The **01** is an open-source platform for conversational devices, inspired by the *Star Trek* computer. -Our flagship operating system can power conversational devices like the Rabbit R1, Humane Pin, or [Star Trek computer](https://www.youtube.com/watch?v=1ZXugicgn6U). +With [Open Interpreter](https://github.com/OpenInterpreter/open-interpreter) at its core, the **01** is more natural, flexible, and capable than its predecessors. Assistants built on **01** can: + +- Execute code +- Browse the web +- Read and create files +- Control third-party software +- ... + +
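To ground the capability list above: the Open Interpreter core referenced here is an ordinary Python package, and the sketch below shows roughly how it is driven (a minimal illustration using Open Interpreter's documented `interpreter.chat()` entry point; the model name and prompt are placeholders, and the 01 layers voice input/output on top of this).

```python
# Minimal sketch of the Open Interpreter core the 01 builds on.
# Assumes `pip install open-interpreter` and an OPENAI_API_KEY in the environment.
from interpreter import interpreter

interpreter.llm.model = "gpt-4o"   # placeholder model choice
interpreter.auto_run = False       # ask for confirmation before running generated code

# One text turn; the 01's server wraps calls like this with speech-to-text and text-to-speech.
interpreter.chat("List the files in my home directory and summarize what you find.")
```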
We intend to become the GNU/Linux of this space by staying open, modular, and free. @@ -59,7 +65,7 @@ poetry run 01 # Runs the 01 Light simulator (hold your spacebar, speak, release)
-**The [RealtimeTTS](https://github.com/KoljaB/RealtimeTTS) and [RealtimeSTT](https://github.com/KoljaB/RealtimeSTT) libraries in the incoming 01-rewrite are thanks to the state-of-the-art voice interface work of [Kolja Beigel](https://github.com/KoljaB). Please star those repos and consider contributing to / utilizing those projects!** +**Note:** The [RealtimeTTS](https://github.com/KoljaB/RealtimeTTS) and [RealtimeSTT](https://github.com/KoljaB/RealtimeSTT) libraries at the heart of the 01 are thanks to the voice interface work of [Kolja Beigel](https://github.com/KoljaB). Please star those repos and consider contributing to / utilizing those projects. # Hardware @@ -73,7 +79,7 @@ poetry run 01 # Runs the 01 Light simulator (hold your spacebar, speak, release) # What does it do? -The 01 exposes a speech-to-speech websocket at `localhost:10001`. +The 01 exposes a speech-to-speech websocket at `localhost:10101`. If you stream raw audio bytes to `/` in [Streaming LMC format](https://docs.openinterpreter.com/guides/streaming-response), you will receive its response in the same format. @@ -112,7 +118,7 @@ To run the server on your Desktop and connect it to your 01 Light, run the follo ```shell brew install ngrok/ngrok/ngrok ngrok authtoken ... # Use your ngrok authtoken -poetry run 01 --server --expose +poetry run 01 --server light --expose ``` The final command will print a server URL. You can enter this into your 01 Light's captive WiFi portal to connect to your 01 Server. @@ -120,11 +126,9 @@ The final command will print a server URL. You can enter this into your 01 Light ## Local Mode ``` -poetry run 01 --local +poetry run 01 --profile local.py ``` -If you want to run local speech-to-text using Whisper, you must install Rust. Follow the instructions given [here](https://www.rust-lang.org/tools/install). - ## Customizations To customize the behavior of the system, edit the [system message, model, skills library path,](https://docs.openinterpreter.com/settings/all-settings) etc. in the `profiles` directory under the `server` directory. This file sets up an interpreter, and is powered by Open Interpreter. @@ -157,10 +161,6 @@ Visit [our roadmap](/ROADMAP.md) to see the future of the 01. The story of devices that came before the 01. -### [Inspiration ↗](https://github.com/KillianLucas/01/tree/main/INSPIRATION.md) - -Things we want to steal great ideas from. -
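As a concrete companion to the "What does it do?" section above, here is a minimal client sketch for the speech-to-speech websocket. It assumes the `websockets` package and a local `question.wav` file; the exact message framing is defined by the Streaming LMC format linked above, so the JSON field names used here are illustrative only.

```python
# Minimal sketch: send one audio question to a running 01 server and print what streams back.
# Assumes: `pip install websockets`, a server on localhost:10101, and a question.wav file.
# The start/end JSON flags below follow the general streaming-LMC pattern but are illustrative.
import asyncio
import json
import websockets

async def main():
    async with websockets.connect("ws://localhost:10101/") as ws:
        await ws.send(json.dumps({"role": "user", "type": "audio", "format": "bytes.wav", "start": True}))
        with open("question.wav", "rb") as f:
            await ws.send(f.read())  # raw audio bytes
        await ws.send(json.dumps({"role": "user", "type": "audio", "format": "bytes.wav", "end": True}))

        # The response streams back in the same format: JSON chunks and/or raw audio bytes.
        async for message in ws:
            if isinstance(message, (bytes, bytearray)):
                print(f"<received {len(message)} audio bytes>")
            else:
                print(message)

asyncio.run(main())
```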
○ diff --git a/ROADMAP.md b/ROADMAP.md index 58938b3..f16124d 100644 --- a/ROADMAP.md +++ b/ROADMAP.md @@ -1,12 +1,13 @@ # Roadmap -Our goal is to power a billion devices with the 01OS over the next 10 years. The Cambrian explosion of AI devices. +Our goal is to power a billion devices with the 01's software over the next 10 years. -We can do that with your help. Help extend the 01OS to run on new hardware, to connect with new peripherals like GPS and cameras, and add new locally running language models to unlock use-cases for this technology that no-one has even imagined yet. +We can do that with your help. Help extend the 01 to run on new hardware, to connect with new peripherals like GPS and cameras, and to add new locally running language models to unlock use-cases for this technology that no-one has imagined. In the coming months, we're going to release: -- [ ] Add support for Azure and PlayHT for fast latency -- [ ] An open-source language model for computer control - [ ] A react-native app for your phone -- [ ] A hand-held device that runs fully offline. +- [ ] Speech-to-speech model support (like `gpt-4o`) instead of TTS/STT + - [ ] Implement `Ultravox` +- [ ] An open-source language model for computer control +- [ ] A hand-held device that runs fully offline. \ No newline at end of file diff --git a/TASKS.md b/TASKS.md deleted file mode 100644 index 946e214..0000000 --- a/TASKS.md +++ /dev/null @@ -1,92 +0,0 @@ -**OI** - -- [ ] Finish skill library. - - [ ] Create a system message that includes instructions on how to use the skill library. - - [ ] Test it end-to-end. - - [ ] Make sure it works with computer.skills.search (it should already work) - - [ ] Create computer.skills.teach() - - [ ] Displays a tkinter message asking users to complete the task via text (eventually voice) in the most generalizable way possible. OI should use computer.mouse and computer.keyboard to fulfill each step, then save the generalized instruction as a skill. Clicking the mouse cancels teach mode. When OI invokes this skill in the future, it will just list those steps (it needs to figure out how to flexibly accomplish each step). - - [ ] Computer: "What do you want to name this skill?" - - [ ] User: Enters name in textbox - - [ ] Computer: "Whats the First Step" - - [ ] User: textbox appears types instructions - - [ ] Textbox disappears - - [ ] OI follows instruction - - [ ] "Did that work?" Yes/No? - - [ ] If No: repeat step training - - [ ] Computer: "Great! What's the next step?" .... - - [ ] Repeat until all steps of skill are completed - - [ ] Save skill as a function next() steps through user's steps - - [ ] Expose ^ via `01 --teach`. -- [ ] pip install 01 - - [ ] Add `01 --server --expose`. - - [ ] Add --server --expose which will expose the server via something like Ngrok, display the public URL and a password, so the 01 Light can connect to it. This will let people use OI on their computer via their Light — i.e. "Check my emails" will run Applescript on their home computer. -- [ ] Sync Interpreter/Computer between code blocks -- [ ] New default dynamic system message with computer API + skills. - - [ ] Develop default system message for executive assistant. - - [ ] Better local system message -- [ ] write good docstrings for computer API -- [ ] Inject computer API into python routine -- [ ] determine streaming LMC protocol - - [ ] inlcude headers? -- [ ] Why is OI starting so slowly? We could use time.time() around things to track it down. -- [ ] Create moondream-powered computer.camera. 
- - [ ] Computer.camera.view(query) should take a picture and ask moondream the query. Defaults to "Describe this image in detail." - - [ ] Takes Picture - - [ ] Sends to describe API - - [ ] prints and returns description - - [ ] Llamafile for phi-2 + moondream - - [ ] test on rPi + Jetson (+android mini phone?) - -**OS** - -- [ ] Queue speech results - - [ ] TTS sentences should be queued + playback should stop once button is pressed -- [ ] expose server using Ngrok -- [ ] Swap out the current hosted functions for local ones. - - [ ] TTS — Piper? OpenVoice? Rasspy? - - [ ] STT — Whisper? Canary? - - [ ] LLM — Phi-2 Q4 Llamafile, just need to download it, OI knows how to use Llamafiles -- [ ] Functional Requirements - - [ ] for initial user setup and first experience - - [ ] If Light and no internet, open a captive wifi page with text boxes: Wifi Name, Wifi Pass, (optional) Server URL, (optional) Server Pass - - [ ] in device.py -- [ ] Camera input from user in device.py -- [ ] Can tapping the mic twice = trigger pressing the "button"? Simple sensing, just based on volume spikes? -- [ ] Update Architecture - - [ ] Base Devise Class - - [ ] Separate folders for Rasberry Pi, Desktop, Droid, App, Web - - [ ] device.py for each folder has input logic for that device - - [ ] Add basic TUI to device.py. Just renders messages and lets you add messages. Can easily copy OI's TUI. - - [ ] index.html for each folder has user interface for that device - - [ ] Web is just index.html - - [ ] Display.html? gui.html? -- [ ] Replace bootloader and boot script— should just run 01, full screen TUI. -- [ ] Package it as an ISO, or figure out some other simple install instructions. How to easily install on a Pi? - -**Hardware** - -- [ ] (Hardware and software) Get the 01OS working on the **Jetson** or Pi. Pick one to move forward with. -- [ ] Connect the Seeed Sense (ESP32 with Wifi, Bluetooth and a mic) to a small DAC + amplifier + speaker. -- [ ] Connect the Seeed Sense to a battery. -- [ ] Configure the ESP32 to be a wireless mic + speaker for the Jetson or Pi. -- [ ] Connect the Jetson or Pi to a battery. -- [ ] Make a rudimentary case for the Seeed Sense + speaker. Optional. -- [ ] Make a rudimentary case for the Jetson or Pi. Optional. - -**Release Day** - -- [ ] Launch video "cambriah explosion" 3d Sketch -- [ ] Create form to get pre-release feedback from 200 interested people (who responded to Killian's tweet) - -**DONE** - -- [ ] Get Local TTS working on Mac [Shiven] -- [ ] Get Local SST working on Mac [Zohaib + Shiven] -- [ ] Debug level logging/printing [Tom] -- [ ] Get hardware (mic, speaker, button) working on the rPi (running on battery) [Ty] -- [ ] device.py conditionals for platform [Ty] -- [ ] Kernal filtering issues [Tom] -- [ ] .env file [Tom] -- [ ] Save computer messages in User.json [Kristijan] -- [ ] Service Management [Zach] diff --git a/TEAMS.md b/TEAMS.md deleted file mode 100644 index af0ddb9..0000000 --- a/TEAMS.md +++ /dev/null @@ -1,69 +0,0 @@ -# Teams - -## Hardware - -- Ben @humanbee -- Ty @tyisfly -- Use Michael as a recruitor -- Shiven @shivenmian -- Jacob Weisel -- Aniket @atneik -- ..? - -## Software - -- Audio (TTS / SST) - - Tasks: Streaming audio both ways. - - Hardware limitations. What's the smallest hardware this can be on? 
- - Zach @zwf - - Zohaib @Zabirauf - - Atai @atai_copilotkit -- OI Core - - Tasks: Computer API (schedule thing), skill library - - Hristijan @thekeyq - - Aakash @ashgam.\_ - - Aniket @atneik - - Shiven @shivenmian - - Ty @tyisfly - - Killian @killianlucas -- OS (formerly 'Linux / Firmware`) - - Tasks: Virtualization? ISO? Putting sensors around the OS to put files into the queue. Bootloader. Networked input into the queue - - Shiven @shivenmian - - Hristijan @thekeyq - - Michael @mjjt - - Zohaib @Zabirauf - -## Experience - -- Design - - Arturo @arturot - - Ronith @ronithk - - Danny @dannytayara - - Killian @killianlucas - - Aniket @atneik - - Alim - - Eschwa? - - Industrial - - Interface - - Web - - Brand / Video - - Arturo @arturot - - Killian @killianlucas - - Matt @matt_rygh - - Finn -- Research - - Ben @humanbee - - Use-cases - - Tasks: Send out typeform—what are motivating examples? - - Testing - - Uli @ulidabess - -## Comms - -- Uli @ulidabess -- Discord Community -- Twitter Presence - - Killian @killianlucas -- Press - - Michael @mjjt - - Zach (connection at NYT) @zwf diff --git a/USE_CASES.md b/USES.md similarity index 100% rename from USE_CASES.md rename to USES.md diff --git a/docs/README.md b/docs/README.md index 7e065b0..1543777 100644 --- a/docs/README.md +++ b/docs/README.md @@ -1,6 +1,4 @@ -# 01OS - -## Contribute +## Contribute to these docs - Clone this repo - install mintlify CLI - run mintlify in project directory to get preview of docs diff --git a/docs/snippets/snippet-intro.mdx b/docs/snippets/snippet-intro.mdx deleted file mode 100644 index c57e7c7..0000000 --- a/docs/snippets/snippet-intro.mdx +++ /dev/null @@ -1,4 +0,0 @@ -One of the core principles of software development is DRY (Don't Repeat -Yourself). This is a principle that apply to documentation as -well. If you find yourself repeating the same content in multiple places, you -should consider creating a custom snippet to keep your content in sync. 
diff --git a/hardware/light/wiring-diagram.jpg b/hardware/light/wiring-diagram.jpg deleted file mode 100644 index 7819289..0000000 Binary files a/hardware/light/wiring-diagram.jpg and /dev/null differ diff --git a/hardware/light/Labeled Wiring Diagram.png b/hardware/light/wiring-diagram.png similarity index 100% rename from hardware/light/Labeled Wiring Diagram.png rename to hardware/light/wiring-diagram.png diff --git a/project_management/communication/TEAM.md b/project_management/communication/TEAM.md deleted file mode 100644 index a820042..0000000 --- a/project_management/communication/TEAM.md +++ /dev/null @@ -1,3 +0,0 @@ -- Uli @ulidabess -- Ben @humanbee -- Killian @killianlucas diff --git a/project_management/communication/press/TEAM.md b/project_management/communication/press/TEAM.md deleted file mode 100644 index df7836c..0000000 --- a/project_management/communication/press/TEAM.md +++ /dev/null @@ -1,3 +0,0 @@ -- Michael @mjjt -- Zach @zwf (connection at NYT) -- Killian @killianlucas diff --git a/project_management/experience/TEAM.md b/project_management/experience/TEAM.md deleted file mode 100644 index 4a9e1f3..0000000 --- a/project_management/experience/TEAM.md +++ /dev/null @@ -1,7 +0,0 @@ -- Arturo @arturot -- Ronith @ronithk -- Danny @dannytayara -- Killian @killianlucas -- Aniket @atneik -- [Alim?](https://twitter.com/almmaasoglu) -- [ESchwaa?](https://twitter.com/ESchwaa) diff --git a/project_management/experience/design/TASKS.md b/project_management/experience/design/TASKS.md deleted file mode 100644 index 28abf58..0000000 --- a/project_management/experience/design/TASKS.md +++ /dev/null @@ -1,5 +0,0 @@ -- [ ] What does 01OS look like when you boot it up? -- [ ] What does 01OS look like when it's running? -- [ ] What does the 01 website look like? - -Awaiting hardware design decisions until hardware team has decided if we're starting from scratch or repurposing. diff --git a/project_management/experience/research/TASKS.md b/project_management/experience/research/TASKS.md deleted file mode 100644 index 44a7ccc..0000000 --- a/project_management/experience/research/TASKS.md +++ /dev/null @@ -1 +0,0 @@ -- [ ] Send out typeform to remote team — what are motivating use-cases? diff --git a/project_management/experience/research/TEAM.md b/project_management/experience/research/TEAM.md deleted file mode 100644 index be7386c..0000000 --- a/project_management/experience/research/TEAM.md +++ /dev/null @@ -1,2 +0,0 @@ -- Ben @humanbee -- Uli @ulidabess diff --git a/project_management/experience/video_and_brand/TEAM.md b/project_management/experience/video_and_brand/TEAM.md deleted file mode 100644 index f826e2e..0000000 --- a/project_management/experience/video_and_brand/TEAM.md +++ /dev/null @@ -1,4 +0,0 @@ -- Arturo @arturot -- Killian @killianlucas -- Matt @matt_rygh -- Finn diff --git a/project_management/hardware/OPTIONS.md b/project_management/hardware/OPTIONS.md deleted file mode 100644 index cff7c03..0000000 --- a/project_management/hardware/OPTIONS.md +++ /dev/null @@ -1,33 +0,0 @@ -### Non-pre-made hardware - -1. Raspberry Pi -2. Raspberry Pi + Coral.ai Accelerator -3. Coral.ai Devboard - -### Assembly-required OSS hardware - -1. [The Raspberry Pi Recovery kit by Jay Doscher.](https://www.doscher.com/work-recovery-kit/) "A MOBILE TERMINAL FOR THE END OF THE WORLD - ". I bet we could reach out to him and have him send some tbh. - -![JAY02105](https://github.com/KillianLucas/01/assets/63927363/14b7438f-fe4c-45ed-86ab-17538c1fc600) - -### Ready to buy, OSS hardware - -1. 
[Clockwork's uConsole](https://www.clockworkpi.com/product-page/uconsole-kit-rpi-cm4-lite) - -![3833f7_9e9fc3ed88534fb0b1eae043b3d5906e~mv2](https://github.com/KillianLucas/01/assets/63927363/ae2bd1f7-ffdf-42e6-87f8-2beb7e3145c6) - -2. [Clockwork's Devterm](https://www.clockworkpi.com/product-page/devterm-kit-r01) - -![3833f7_4f7e8e064a984027bddff865db0ca1b7~mv2](https://github.com/KillianLucas/01/assets/63927363/ee8cbfd4-bcb1-4eac-8c4d-d864fe3a0266) - -### Ready to buy, non-OSS hardware - -Can we modify the OS on these things? Some are OEM, which I think means we can contact the manufacturer and ask for changes. - -1. [Conference speaker](https://www.amazon.com/dp/B0CCP1J8QW/ref=sspa_dk_detail_0?psc=1&pd_rd_i=B0CCP1J8QW&pd_rd_w=0wR2S&content-id=amzn1.sym.d81b167d-1f9e-48b6-87d8-8aa5e473ea8c&pf_rd_p=d81b167d-1f9e-48b6-87d8-8aa5e473ea8c&pf_rd_r=60DJHP5JV1DJ0BJ3V7N4&pd_rd_wg=OUF4S&pd_rd_r=c4d7e254-7b9e-4025-a252-7851ef880a18&s=musical-instruments&sp_csd=d2lkZ2V0TmFtZT1zcF9kZXRhaWxfdGhlbWF0aWM) -2. [Smartwatch](https://www.amazon.com/Parsonver-Smartwatch-Bluetooth-Activity-Pedometer/dp/B0BPM16KVM/ref=sr_1_22_sspa?keywords=voice%2Bassistant%2Bandroid&qid=1706051147&sr=8-22-spons&ufe=app_do%3Aamzn1.fos.006c50ae-5d4c-4777-9bc0-4513d670b6bc&sp_csd=d2lkZ2V0TmFtZT1zcF9tdGY&th=1) -3. [Smartwatch that looks like the 01 Light](https://www.alibaba.com/product-detail/MTL135-Reloj-Android-Smartwatch-2023-Montre_1600707760136.html?spm=a2700.galleryofferlist.normal_offer.d_image.24af7083iEzmhs) -4. [Smartwatch that looks like a square 01 Light](https://www.alibaba.com/product-detail/2023-Newest-4g-Sim-Call-S8_1600898456587.html?spm=a2700.galleryofferlist.normal_offer.d_image.2e9f70836cO7ae) -5. [Mic + speaker + button](https://www.alibaba.com/product-detail/Wholesale-CHATGPT4-0-ODM-OEM-Microphone_1601008248994.html?spm=a2700.galleryofferlist.p_offer.d_title.25ec7a08qFPP5l&s=p) -6. [shit, is the 01 Heavy just a weird laptop](https://www.alibaba.com/product-detail/8-Inch-Mini-Pocket-Laptop-Tablet_1600842995304.html) diff --git a/project_management/hardware/TASKS.md b/project_management/hardware/TASKS.md deleted file mode 100644 index 3b4ef39..0000000 --- a/project_management/hardware/TASKS.md +++ /dev/null @@ -1,9 +0,0 @@ -- [ ] **Should we just buy pre-made hardware?** - -Some bodies— like the 01 Light without a screen and with a camera, and the 01 Heavy as a screenless tape-recorder with a camera and a button on the side— do not exist. So perhaps we should make CAD files and build them. - -Other bodies we've floated already exist as Android phones (isn't the Rabbit R1 basically an Android phone?) smartwatches, laptops, and cyberdecks. - -I think we should decide by 1) estimating how long custom hardware would take, and 2) weighing it against how _memorable of an impression_ the 01 would make if it did/did't have custom hardware. Holding something unique is a big part of this. But we might accomplish that by using some of the more bizarre looking hardware. 
- -[Check out some of the options on the table here.](OPTIONS.md) diff --git a/project_management/hardware/TEAM.md b/project_management/hardware/TEAM.md deleted file mode 100644 index 9189413..0000000 --- a/project_management/hardware/TEAM.md +++ /dev/null @@ -1,7 +0,0 @@ -- Ben @humanbee -- Ty @tyisfly -- Shiven @shivenmian -- Jacob Weisel -- Aniket @atneik - -* for later, Michael offered to recruit more to this team diff --git a/project_management/hardware/devices/jetson-nano/README.md b/project_management/hardware/devices/jetson-nano/README.md deleted file mode 100644 index 08a7c02..0000000 --- a/project_management/hardware/devices/jetson-nano/README.md +++ /dev/null @@ -1,22 +0,0 @@ -# Development Setup for Jetson Nano - -1. Go through the tutorial here: https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit#intro - -2. At the end of that guide, you should have a Jetson running off a power supply or micro USB. - -3. Get network connectivity. The Jetson does not have a WiFi module so you will need to plug in ethernet. - If you have a laptop, you can share internet access over Ethernet. - - To do this with Mac, do the following: - - a. Plug a cable from the Jetson Ethernet port to your Mac (you can use a Ethernet -> USB converter for your Mac). - - b. Go to General->Sharing, then click the little `(i)` icon next to "Internet Sharing", and check all the options. - - ![](mac-share-internet.png) - - c. Go back to General->Sharing, and turn on "Internet Sharing". - - ![](mac-share-internet-v2.png) - - d. Now the Jetson should have connectivity! diff --git a/project_management/hardware/devices/jetson-nano/mac-share-internet-v2.png b/project_management/hardware/devices/jetson-nano/mac-share-internet-v2.png deleted file mode 100644 index 74e1de4..0000000 Binary files a/project_management/hardware/devices/jetson-nano/mac-share-internet-v2.png and /dev/null differ diff --git a/project_management/hardware/devices/jetson-nano/mac-share-internet.png b/project_management/hardware/devices/jetson-nano/mac-share-internet.png deleted file mode 100644 index 51aaa5d..0000000 Binary files a/project_management/hardware/devices/jetson-nano/mac-share-internet.png and /dev/null differ diff --git a/project_management/hardware/devices/raspberry-pi/README.md b/project_management/hardware/devices/raspberry-pi/README.md deleted file mode 100644 index f949fb9..0000000 --- a/project_management/hardware/devices/raspberry-pi/README.md +++ /dev/null @@ -1,79 +0,0 @@ -# How to set up 01 on a Raspberry Pi - -## Supplies needed - -- Raspberry Pi 5 -- Micro SD Card -- USB-C cable -- Micro HDMI to HDMI cable -- Monitor -- Keyboard -- Mouse -- USB Microphone ([like this one](https://www.amazon.com/dp/B071WH7FC6?psc=1&ref=ppx_yo2ov_dt_b_product_details)) -- USB or Bluetooth speaker -- Breadboard, jumper wires, 220R resistor and button (a kit like [this one](https://www.amazon.com/Smraza-Electronics-Potentiometer-tie-Points-Breadboard/dp/B0B62RL725/ref=sr_1_20?crid=MQDBAOQU7RYY&keywords=breadboard+kit&qid=1707665692&s=electronics&sprefix=breadboard%2Celectronics%2C346&sr=1-20) has everything you need) - -## SD card setup - -- Flash a new sd card using [Raspberry Pi Imager](https://www.raspberrypi.com/software/) - - Pick your device (only tested on Raspberry Pi 5) - - Select the OS: Scroll down to "Other General OS" Then select Ubuntu Desktop 64bit - - Select the storage: Select your sd card - - Proceed to flashing by selecting "Write" - -## Hardware set up - -- Connect Raspberry pi board to USB-C power -- 
Connect a keyboard, mouse, and mic to the USB ports -- Connect a monitor to the micro HDMI port -- Insert your newly flashed SD card into the slot under the device by the power button -- Power it on with the power button -- Hook up the Button to the breadboard,it should look like this: - ![Button](button-diagram.png) - -## Ubuntu set up - -- Go through the system configuration on start up: - - Make sure to connect to wifi, we will need it to install 01 and it's packages - - Choose a password you will remember, you will need it later -- Open terminal -- `sudo apt update && sudo apt upgrade -y` - - Sometimes `dpkg` will complain, if it does, run `sudo dpkg --configure -a` and then run the update and upgrade commands again - -Clone the repo: - -- `sudo apt install git -y` -- `git clone https://github.com/KillianLucas/01` -- `cd 01/OS/01/` - -Set up a virtual environment: - -- `sudo apt install python3-venv -y` -- `python3 -m venv venv` -- `source venv/bin/activate` - -Install packages: - -- `sudo apt install ffmpeg portaudio19-dev` (ffmpeg and portaudio19-dev need to be installed with apt on linux) -- `sudo apt-get update` -- `sudo apt-get install gpiod` -- `pip install -r requirements.txt` -- pyaudio install might fail, these commands should fix it: - - - `sudo apt-get install gcc make python3-dev portaudio19-dev` - - `pip install pyaudio` - -Rename and edit the .env file: - -- `mv .env.example .env` (rename the .env file) -- Add your OpenAI key to the .env file, or by running `export OPENAI_API_KEY="sk-..."` - - To add it to the .env in the terminal, run `nano .env` - - Add the key to the `OPENAI_API_KEY` line - - Save and exit by pressing `ctrl + x`, then `y`, then `enter` - -Run the start script: - -- `bash start.sh` - - There may be a few packages that didn't install, yielding a 'ModuleNotFoundError' error. If you see this, manually install each of them with pip and retry the `bash start.sh` command. - -Done! You should now be able to use 01 on your Raspberry Pi 5, and use the button to invoke the assistant. diff --git a/project_management/hardware/devices/raspberry-pi/button-diagram.png b/project_management/hardware/devices/raspberry-pi/button-diagram.png deleted file mode 100644 index 64456e2..0000000 Binary files a/project_management/hardware/devices/raspberry-pi/button-diagram.png and /dev/null differ diff --git a/project_management/meetups/01-20-24.md b/project_management/meetups/01-20-24.md deleted file mode 100644 index 087f426..0000000 --- a/project_management/meetups/01-20-24.md +++ /dev/null @@ -1,20 +0,0 @@ -# January 20th, 2024 - -At our first meetup, we discussed the context and future of the six-week project and I laid out [four goals](https://github.com/KillianLucas/01/blob/main/GOALS.md). - -### [Presentation Slides ↗](https://www.canva.com/design/DAF56kADkyc/2IgFkCuPoUg5lmv6-gGadg/view?utm_content=DAF56kADkyc&utm_campaign=designshare&utm_medium=link&utm_source=editor) - -## Whiteboards - -Regarding the minimal body: - -![IMG_6280](https://github.com/KillianLucas/01/assets/63927363/6e0f833a-ffab-43ff-99b3-0914ff0a34db) - -Regarding the heavy body: - -![IMG_6282](https://github.com/KillianLucas/01/assets/63927363/c06bd0f5-eef8-4e26-83ec-0afeaa07eab6) - -## Decisions - -1. We'll try to build around the use-cases, some of which [I have compiled here.](https://github.com/KillianLucas/01/blob/main/USE_CASES.md) If you think of more please make a PR. -2. 
We want to design two bodies to house the 01, one will be very minimal and require an internet connection (possible names: The 01 **Light**, The 01 **Click**, or The 01 **Feather**) and another will run fully locally (The 01 **Heavy**). diff --git a/project_management/software/audio/TASKS.md b/project_management/software/audio/TASKS.md deleted file mode 100644 index 11a8d82..0000000 --- a/project_management/software/audio/TASKS.md +++ /dev/null @@ -1,3 +0,0 @@ -- [ ] STT implementation — Can we get a bash script that we can run on startup that starts a whisper.cpp tiny binary with an endpoint to connect to it (or something) so script.js can stream audio to it? -- [ ] TSS implementation — Same as above ^ bash script that starts Rhasspy then some way to connect script.js to it? -- [ ] Hardware limitations / find minimum requirements for this to be performant. What's the shittiest hardware this can be run on? diff --git a/project_management/software/audio/TEAM.md b/project_management/software/audio/TEAM.md deleted file mode 100644 index a6e5da3..0000000 --- a/project_management/software/audio/TEAM.md +++ /dev/null @@ -1,5 +0,0 @@ -- Zach @zwf -- Zohaib @Zabirauf -- Atai @atai_copilotkit - -Team lead: Zach diff --git a/project_management/software/oi_core/TASKS.md b/project_management/software/oi_core/TASKS.md deleted file mode 100644 index 2c116ed..0000000 --- a/project_management/software/oi_core/TASKS.md +++ /dev/null @@ -1,3 +0,0 @@ -- [ ] Release Open Interpreter `0.2.1` -- [ ] Meet to determine Computer API additions for the 01 -- [ ] Meet to decide how to build the skill library + skill recording diff --git a/project_management/software/oi_core/TEAM.md b/project_management/software/oi_core/TEAM.md deleted file mode 100644 index b5b6810..0000000 --- a/project_management/software/oi_core/TEAM.md +++ /dev/null @@ -1,8 +0,0 @@ -- Hristijan @thekeyq -- Aakash @ashgam.\_ -- Aniket @atneik -- Shiven @shivenmian -- Ty @tyisfly -- Killian @killianlucas - -Team lead: Killian diff --git a/project_management/software/os/TASKS.md b/project_management/software/os/TASKS.md deleted file mode 100644 index 85f452a..0000000 --- a/project_management/software/os/TASKS.md +++ /dev/null @@ -1,13 +0,0 @@ -- [ ] Modify bootloader. -- [ ] Decide: better queue? -
- So, Michael suggested we simply watch and filter the `dmesg` stream (I think that's what it's called?), so I suppose we could have a script like `/01/core/kernel_watch.py` that puts things into the queue? Honestly knowing we could get it all from one place like that— maybe this should be simpler. **Is the queue folder necessary?** How about we just expect the computer to send {"role": "computer"} messages to a POST endpoint at "/queue" or maybe "/inturrupt" or maybe "/" but with POST? When it gets those it puts them in the redis queue, which is checked frequently, so it's handled immediatly. So then yeah, maybe we do have redis there, then instead of looking at that folder, we check the redis queue. Is this better for any reason? Making the way computer messages are sent = an HTTP request, not putting a file in a folder? -- [ ] Virtualization? -- [ ] Best workflow for pressing to an ISO? Cubic? -- [ ] Putting sensors around the OS to put things into the queue / `dmesg` implementation. -- [ ] Networked input into the queue? (Exploring this makes me thing the "/queue" or something endpoint is smarter to do than the "queue" folder) - -# For later - -- [ ] We could have `/i` which other interpreter's hit. That behaves more like the OpenAI POST endpoint with stream=True by default (i think this is important for users to see the exchange happening in real time, streaming `event/stream` or whatever). You could imagine some kind of handshake — another interpreter → my interpreter's /i → the sender is unrecognized → computer message is sent to /, prompting AI to ask the user to have the sending interpreter send a specific code → the user tells the sending interpreter to use that specific code → the sender is recognized and added to friends-list (`computer.inetwork.friends()`) → now they can hit eachother's i endpoints freely with `computer.inetwork.friend(id).message("hey")`. -- [ ] (OS team: this will require coordination with the OI core team, so let's talk about it / I'll explain at the next meetup.) When transfering skills that require OS control, the sender can replace those skills with that command, with one input "natural language query" (?) proceeded by the skill function name or something like that. Basically so if you ask it to do something you set up as a skill, it actually asks your computer to do it. If you ask your computer to do it directly, it's more direct. diff --git a/project_management/software/os/TEAM.md b/project_management/software/os/TEAM.md deleted file mode 100644 index 2a88b1d..0000000 --- a/project_management/software/os/TEAM.md +++ /dev/null @@ -1,5 +0,0 @@ -- Shiven @shivenmian -- Hristijan @thekeyq -- Killian @killianlucas -- Michael @mjjt -- Zohaib @Zabirauf diff --git a/run_pytest.py b/run_pytest.py deleted file mode 100644 index 1a9ce25..0000000 --- a/run_pytest.py +++ /dev/null @@ -1,36 +0,0 @@ -import subprocess -import sys -import ctypes -import os - - -def main(): - """Run pytest in the software directory. - - This script is intended to be used as a pre-commit hook to run the tests from the root of the repository. - """ - - # Additional setup for Windows (10 at least) to prevent issues with Unicode characters in the console. 
- # see https://www.reddit.com/r/learnpython/comments/350c8c/unicode_python_3_and_the_windows_console/ - if sys.platform.startswith("win"): - # Force UTF-8 encoding in Python - os.environ["PYTHONUTF8"] = "1" - - # Change Windows console code page to UTF-8 - ctypes.windll.kernel32.SetConsoleCP(65001) - ctypes.windll.kernel32.SetConsoleOutputCP(65001) - - # Define the target directory relative to this script location. - target_directory = os.path.join(os.path.dirname(__file__), "software") - - os.chdir(target_directory) - - # Run pytest with any additional arguments passed to this script. - result = subprocess.run(["pytest"] + sys.argv[1:]) - - # Exit with pytest's exit code to reflect the test outcome in the pre-commit hook. - sys.exit(result.returncode) - - -if __name__ == "__main__": - main() diff --git a/software/.cursorignore b/software/.cursorignore deleted file mode 100644 index 7a81b42..0000000 --- a/software/.cursorignore +++ /dev/null @@ -1,3 +0,0 @@ -_archive -__pycache__ -.idea diff --git a/software/start.py b/software/main.py similarity index 91% rename from software/start.py rename to software/main.py index d63b553..a437fe7 100644 --- a/software/start.py +++ b/software/main.py @@ -11,14 +11,17 @@ ... --qr # Displays a qr code """ +from yaspin import yaspin +spinner = yaspin() +spinner.start() + import typer import ngrok import platform import threading import os import importlib -from source.server.tunnel import create_tunnel -from source.server.async_server import start_server +from source.server.server import start_server import subprocess import socket import json @@ -124,11 +127,14 @@ def run( if server == "light": light_server_port = server_port + voice = True # The light server will support voice elif server == "livekit": # The light server should run at a different port if we want to run a livekit server + spinner.stop() print(f"Starting light server (required for livekit server) on the port before `--server-port` (port {server_port-1}), unless the `AN_OPEN_PORT` env var is set.") print(f"The livekit server will be started on port {server_port}.") light_server_port = os.getenv('AN_OPEN_PORT', server_port-1) + voice = False # The light server will NOT support voice. It will just run Open Interpreter. The Livekit server will handle voice server_thread = threading.Thread( target=start_server, @@ -136,9 +142,12 @@ def run( server_host, light_server_port, profile, + voice, debug ), ) + spinner.stop() + print("Starting server...") server_thread.start() threads.append(server_thread) @@ -164,7 +173,7 @@ def run( # Start the livekit worker worker_thread = threading.Thread( - target=run_command, args=("python worker.py dev",) # TODO: This should not be a CLI, it should just run the python file + target=run_command, args=("python source/server/livekit/worker.py dev",) # TODO: This should not be a CLI, it should just run the python file ) time.sleep(7) worker_thread.start() @@ -208,6 +217,8 @@ def run( ) client_thread = threading.Thread(target=module.run, args=[server_url, debug]) + spinner.stop() + print("Starting client...") client_thread.start() threads.append(client_thread) diff --git a/software/poetry.lock b/software/poetry.lock index bf367b0..0b1cc80 100644 --- a/software/poetry.lock +++ b/software/poetry.lock @@ -496,17 +496,6 @@ files = [ [package.dependencies] numpy = {version = ">=1.19.0", markers = "python_version >= \"3.9\""} -[[package]] -name = "bottle" -version = "0.12.25" -description = "Fast and simple WSGI-framework for small web-applications." 
-optional = false -python-versions = "*" -files = [ - {file = "bottle-0.12.25-py3-none-any.whl", hash = "sha256:d6f15f9d422670b7c073d63bd8d287b135388da187a0f3e3c19293626ce034ea"}, - {file = "bottle-0.12.25.tar.gz", hash = "sha256:e1a9c94970ae6d710b3fb4526294dfeb86f2cb4a81eff3a4b98dc40fb0e5e021"}, -] - [[package]] name = "cachetools" version = "5.5.0" @@ -763,20 +752,6 @@ azure = ["azure-storage-blob (>=12)"] gs = ["google-cloud-storage"] s3 = ["boto3"] -[[package]] -name = "clr-loader" -version = "0.2.6" -description = "Generic pure Python loader for .NET runtimes" -optional = false -python-versions = ">=3.7" -files = [ - {file = "clr_loader-0.2.6-py3-none-any.whl", hash = "sha256:79bbfee4bf6ac2f4836d89af2c39e0c32dce5d0c062596185aef380f317507a6"}, - {file = "clr_loader-0.2.6.tar.gz", hash = "sha256:019348ae6b6a83c7a406d14537c277cecf7a3a53b263ec342c81ded5845a67ee"}, -] - -[package.dependencies] -cffi = ">=1.13" - [[package]] name = "colorama" version = "0.4.6" @@ -1000,36 +975,36 @@ test = ["accelerate (>=0.20.0)", "torchvision (>=0.15.1)"] [[package]] name = "ctranslate2" -version = "4.1.0" +version = "4.3.1" description = "Fast inference engine for Transformer models" optional = false python-versions = ">=3.8" files = [ - {file = "ctranslate2-4.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:8b3154cb8bfd4f320ee6dcc5ec2962c020a649eb2311e0edb90bc720f0eab529"}, - {file = "ctranslate2-4.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:7e06cf9a0c6cf4e91c9edb1bdcb0c78fa9cd3fe5b18d7a380194e82f5881917c"}, - {file = "ctranslate2-4.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1d11878e72dce458e6d30ca6f06b4cfe92a6a0e6d271879de4208100cbac3fa5"}, - {file = "ctranslate2-4.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1793445fddf1ceea25aaaf7ebfa9adce6d774411fd0e3bae123355c71d122dff"}, - {file = "ctranslate2-4.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:f22a81138a106e42659d3c8b848f58ea813de8a7f17bf72e5aebbe3bf24cb5b4"}, - {file = "ctranslate2-4.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6b9580e3b494e8414a2a5f7733e029fd534b1b942ed657c27d413442299c661c"}, - {file = "ctranslate2-4.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:176e2856c2f312d85bc96358cf8c4ef7a377436d789942354b1c3a6d5d32cea1"}, - {file = "ctranslate2-4.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:32d8a25fe1854454f236f02caf3ca819ac63fd3b7f6edd9c1b7dc7f4998451de"}, - {file = "ctranslate2-4.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02f92e2831b0ab04d5d2e6e1139f4035db859e7f99fe6a98fb9a385d99ce2d70"}, - {file = "ctranslate2-4.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:829fd5d8bfc907e48312072aae4470093731c3714e4248abf5034942f240742e"}, - {file = "ctranslate2-4.1.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:48a520d6cc89f7bac491a1445ca1c5d2afd0dd40793ac1970d05d89df0064184"}, - {file = "ctranslate2-4.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4bdbe25946526ad2f4ed4b9ab23779d106c91d80e996222617e212aa493c13be"}, - {file = "ctranslate2-4.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:040a33118c4b24e6bc73726491785c5f24ac365c1878b04f8772684eaa54ef68"}, - {file = "ctranslate2-4.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e43be4745ae2c87f9863b7006169b81983749041f085ee62c8858e739f25c255"}, - {file = "ctranslate2-4.1.0-cp312-cp312-win_amd64.whl", hash = 
"sha256:737610101b852ffaefa8f5534645cbbbe53d165faa78d587a732ad35ab815f88"}, - {file = "ctranslate2-4.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:df3d923460e92f8641bd76e0592d208bff763226e52ac79e2a7d77ef714bffb1"}, - {file = "ctranslate2-4.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:cafa26a6c55d31081544cacbe222ab0226369ae8a35ddf14b493078601f6825a"}, - {file = "ctranslate2-4.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:190ef37ec43daf56abaf5103d8d9b425cef2d2aa80d5e7dd900d2e30157c5fad"}, - {file = "ctranslate2-4.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65e44dfcd88376310fc2b51561550600ec444f62ce9b3469d0e1f98d71d5e844"}, - {file = "ctranslate2-4.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:72b4719997ee2693fff8d00e0d63046595a5e293c2412941115afdc55791a92f"}, - {file = "ctranslate2-4.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:411c84d8858b65dd05595c659a89a405426bd824788132139f474c48998255de"}, - {file = "ctranslate2-4.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2f70b696cbb222cb1dfa3b4ab1a385c1c801df7f250ede63b2d30bdb0471f1a5"}, - {file = "ctranslate2-4.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cf9f23e7a354024758ff503a9c287986b0b49b363695438ce90056f390f0ac9d"}, - {file = "ctranslate2-4.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8964d97936950ecd9ec0bc90ceb503156c704901287d52b0129df9d6800dd81b"}, - {file = "ctranslate2-4.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:4adaeb8fdaa61b881b8e91bbc01b580b07b6581facc3de01844a09dad704b31e"}, + {file = "ctranslate2-4.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e962c9dc3ddfacf60f2467bea5f91f75239c3d9c17656e4b0c569d956d662b99"}, + {file = "ctranslate2-4.3.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:49a0d9136d577b667c1bb450267248d9cf205b5eb28b89b3f70c296ec5285da8"}, + {file = "ctranslate2-4.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:343b24fe3d8a5b6a7c8082332415767bef7ceaf15bb43d0cec7e83665108c51e"}, + {file = "ctranslate2-4.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d95ecb440e4985cad4623a1fe7bb91406bab4aa55b00aa89a0c16eb5939d640"}, + {file = "ctranslate2-4.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:febf7cf0fb641c76035cdece58e97d27f4e8950a5e32fc480f9afa1bcbbb856c"}, + {file = "ctranslate2-4.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a49dc5d339e2f4ed016553db0d0e6cbd369742697c87c6cc0cc15a47c7c72d00"}, + {file = "ctranslate2-4.3.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:def98f6f8900470b2cec9408e5b0402af75f40f771391ebacd2b60666b8d75b9"}, + {file = "ctranslate2-4.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:30c02fcd5a7be93bf42a8adf81a9ac4f394e23bd639192907b2e11feae589971"}, + {file = "ctranslate2-4.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a06043910a7dee91ea03634be2cff2e1338a9f87bb51e062c03bae69e2c826b6"}, + {file = "ctranslate2-4.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:6f49834b63848f17dfdc1b2b8c632c31932ad69e130ce0f7b1e2505aa3923e6c"}, + {file = "ctranslate2-4.3.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:fcf649d976070ddd33cdda00a7a60fde6f1fbe27d65d2c6141dd95153f965f01"}, + {file = "ctranslate2-4.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f63f779f1d4518acdc694b1938887d4f28613ac2dfe507ccc2c0d56dd8c95b40"}, + {file = 
"ctranslate2-4.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68301fbc5fb7daa609eb12ca6c2ed8aa29852c20f962532317762d1889e751d9"}, + {file = "ctranslate2-4.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45c5b352783bd3806f0c9f5dcbfa49d89c0dde71cb7d1b1c527c525e85af3ded"}, + {file = "ctranslate2-4.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:08626f115d5a39c56a666680735d6eebfc4d8a215288896d4d8afc14cfcdcffe"}, + {file = "ctranslate2-4.3.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:e40d43c5f7d25f40d31cca0541cf21c2846f89509b99189d340fdee595391196"}, + {file = "ctranslate2-4.3.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f352bcb802ab9ff1b94a25b4915c4f9f97cdd230993cf45ea290592d8997c2e2"}, + {file = "ctranslate2-4.3.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c202011fa2ebb8129ba98a65df48df075f0ef53f905f2b13b8cd00f31c7ccff"}, + {file = "ctranslate2-4.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4bca2ce519c497bc2f79e567093609d7bdfaff3313220e0d831797288803f3aa"}, + {file = "ctranslate2-4.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:ef812a4129e877f64f8ca2438b6247060af0f053a56b438dbfa81dae9ca12675"}, + {file = "ctranslate2-4.3.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:d8679354547260db999c2bcc6f11a31dad828c3d896d6120045bd0333940732f"}, + {file = "ctranslate2-4.3.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:60bc176dd2e0ee6ddd33682401440f7626d115fed4f1e5e6816d9f7f213d1a62"}, + {file = "ctranslate2-4.3.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7d394367fe472b6540489e3b081fc7e17cea2264075b074fb28eca30ff63463f"}, + {file = "ctranslate2-4.3.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9f1fd426d9019198d0fd8f37a18bf9c486241f711d597686956c58cd7676d564"}, + {file = "ctranslate2-4.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:de05e33790d72492a76101a0357c3d87d97ad53af84417c78f45e85df76d39e8"}, ] [package.dependencies] @@ -1489,13 +1464,13 @@ standard = ["uvicorn[standard] (>=0.15.0)"] [[package]] name = "faster-whisper" -version = "1.0.2" +version = "1.0.3" description = "Faster Whisper transcription with CTranslate2" optional = false python-versions = ">=3.8" files = [ - {file = "faster-whisper-1.0.2.tar.gz", hash = "sha256:54d9fc698f7c665e00a0d5ed65d6e975b72a8862b8214f20a22e79b115c41511"}, - {file = "faster_whisper-1.0.2-py3-none-any.whl", hash = "sha256:d968c289222e766a49ed97eecec24e934bdef405183f57d6d434a364bb3569c1"}, + {file = "faster-whisper-1.0.3.tar.gz", hash = "sha256:1a145db86450b56aaa623c8df7d4ef86e8a1159900f60533e2890e98e8453a17"}, + {file = "faster_whisper-1.0.3-py3-none-any.whl", hash = "sha256:364d0e378ab232ed26f39656e5c98548b38045224e206b20f7d8c90e2745b9d3"}, ] [package.dependencies] @@ -1509,23 +1484,6 @@ tokenizers = ">=0.13,<1" conversion = ["transformers[torch] (>=4.23)"] dev = ["black (==23.*)", "flake8 (==6.*)", "isort (==5.*)", "pytest (==7.*)"] -[[package]] -name = "ffmpeg-python" -version = "0.2.0" -description = "Python bindings for FFmpeg - with complex filtering support" -optional = false -python-versions = "*" -files = [ - {file = "ffmpeg-python-0.2.0.tar.gz", hash = "sha256:65225db34627c578ef0e11c8b1eb528bb35e024752f6f10b78c011f6f64c4127"}, - {file = "ffmpeg_python-0.2.0-py3-none-any.whl", hash = "sha256:ac441a0404e053f8b6a1113a77c0f452f1cfc62f6344a769475ffdc0f56c23c5"}, -] - -[package.dependencies] -future = "*" - -[package.extras] -dev = ["Sphinx (==2.1.0)", 
"future (==0.17.1)", "numpy (==1.16.4)", "pytest (==4.6.1)", "pytest-mock (==1.10.4)", "tox (==3.12.1)"] - [[package]] name = "filelock" version = "3.15.4" @@ -1746,17 +1704,6 @@ test-downstream = ["aiobotocore (>=2.5.4,<3.0.0)", "dask-expr", "dask[dataframe, test-full = ["adlfs", "aiohttp (!=4.0.0a0,!=4.0.0a1)", "cloudpickle", "dask", "distributed", "dropbox", "dropboxdrivefs", "fastparquet", "fusepy", "gcsfs", "jinja2", "kerchunk", "libarchive-c", "lz4", "notebook", "numpy", "ocifs", "pandas", "panel", "paramiko", "pyarrow", "pyarrow (>=1)", "pyftpdlib", "pygit2", "pytest", "pytest-asyncio (!=0.22.0)", "pytest-benchmark", "pytest-cov", "pytest-mock", "pytest-recording", "pytest-rerunfailures", "python-snappy", "requests", "smbprotocol", "tqdm", "urllib3", "zarr", "zstandard"] tqdm = ["tqdm"] -[[package]] -name = "future" -version = "1.0.0" -description = "Clean single-source support for Python 3 and 2" -optional = false -python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*" -files = [ - {file = "future-1.0.0-py3-none-any.whl", hash = "sha256:929292d34f5872e70396626ef385ec22355a1fae8ad29e1a734c3e43f9fbc216"}, - {file = "future-1.0.0.tar.gz", hash = "sha256:bd2968309307861edae1458a4f8a4f3598c03be43b97521076aebf5d94c07b05"}, -] - [[package]] name = "git-python" version = "1.0.3" @@ -1947,82 +1894,63 @@ protobuf = ">=3.20.2,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4 [package.extras] grpc = ["grpcio (>=1.44.0,<2.0.0.dev0)"] -[[package]] -name = "groq" -version = "0.5.0" -description = "The official Python library for the groq API" -optional = false -python-versions = ">=3.7" -files = [ - {file = "groq-0.5.0-py3-none-any.whl", hash = "sha256:a7e6be1118bcdfea3ed071ec00f505a34d4e6ec28c435adb5a5afd33545683a1"}, - {file = "groq-0.5.0.tar.gz", hash = "sha256:d476cdc3383b45d2a4dc1876142a9542e663ea1029f9e07a05de24f895cae48c"}, -] - -[package.dependencies] -anyio = ">=3.5.0,<5" -distro = ">=1.7.0,<2" -httpx = ">=0.23.0,<1" -pydantic = ">=1.9.0,<3" -sniffio = "*" -typing-extensions = ">=4.7,<5" - [[package]] name = "grpcio" -version = "1.65.5" +version = "1.66.0" description = "HTTP/2-based RPC framework" optional = false python-versions = ">=3.8" files = [ - {file = "grpcio-1.65.5-cp310-cp310-linux_armv7l.whl", hash = "sha256:b67d450f1e008fedcd81e097a3a400a711d8be1a8b20f852a7b8a73fead50fe3"}, - {file = "grpcio-1.65.5-cp310-cp310-macosx_12_0_universal2.whl", hash = "sha256:a70a20eed87bba647a38bedd93b3ce7db64b3f0e8e0952315237f7f5ca97b02d"}, - {file = "grpcio-1.65.5-cp310-cp310-manylinux_2_17_aarch64.whl", hash = "sha256:f79c87c114bf37adf408026b9e2e333fe9ff31dfc9648f6f80776c513145c813"}, - {file = "grpcio-1.65.5-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f17f9fa2d947dbfaca01b3ab2c62eefa8240131fdc67b924eb42ce6032e3e5c1"}, - {file = "grpcio-1.65.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:32d60e18ff7c34fe3f6db3d35ad5c6dc99f5b43ff3982cb26fad4174462d10b1"}, - {file = "grpcio-1.65.5-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:fe6505376f5b00bb008e4e1418152e3ad3d954b629da286c7913ff3cfc0ff740"}, - {file = "grpcio-1.65.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:33158e56c6378063923c417e9fbdb28660b6e0e2835af42e67f5a7793f587af7"}, - {file = "grpcio-1.65.5-cp310-cp310-win32.whl", hash = "sha256:1cbc208edb9acf1cc339396a1a36b83796939be52f34e591c90292045b579fbf"}, - {file = "grpcio-1.65.5-cp310-cp310-win_amd64.whl", hash = "sha256:bc74f3f745c37e2c5685c9d2a2d5a94de00f286963f5213f763ae137bf4f2358"}, - {file = 
"grpcio-1.65.5-cp311-cp311-linux_armv7l.whl", hash = "sha256:3207ae60d07e5282c134b6e02f9271a2cb523c6d7a346c6315211fe2bf8d61ed"}, - {file = "grpcio-1.65.5-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:a2f80510f99f82d4eb825849c486df703f50652cea21c189eacc2b84f2bde764"}, - {file = "grpcio-1.65.5-cp311-cp311-manylinux_2_17_aarch64.whl", hash = "sha256:a80e9a5e3f93c54f5eb82a3825ea1fc4965b2fa0026db2abfecb139a5c4ecdf1"}, - {file = "grpcio-1.65.5-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0b2944390a496567de9e70418f3742b477d85d8ca065afa90432edc91b4bb8ad"}, - {file = "grpcio-1.65.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3655139d7be213c32c79ef6fb2367cae28e56ef68e39b1961c43214b457f257"}, - {file = "grpcio-1.65.5-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:05f02d68fc720e085f061b704ee653b181e6d5abfe315daef085719728d3d1fd"}, - {file = "grpcio-1.65.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1c4caafe71aef4dabf53274bbf4affd6df651e9f80beedd6b8e08ff438ed3260"}, - {file = "grpcio-1.65.5-cp311-cp311-win32.whl", hash = "sha256:84c901cdec16a092099f251ef3360d15e29ef59772150fa261d94573612539b5"}, - {file = "grpcio-1.65.5-cp311-cp311-win_amd64.whl", hash = "sha256:11f8b16121768c1cb99d7dcb84e01510e60e6a206bf9123e134118802486f035"}, - {file = "grpcio-1.65.5-cp312-cp312-linux_armv7l.whl", hash = "sha256:ee6ed64a27588a2c94e8fa84fe8f3b5c89427d4d69c37690903d428ec61ca7e4"}, - {file = "grpcio-1.65.5-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:76991b7a6fb98630a3328839755181ce7c1aa2b1842aa085fd4198f0e5198960"}, - {file = "grpcio-1.65.5-cp312-cp312-manylinux_2_17_aarch64.whl", hash = "sha256:89c00a18801b1ed9cc441e29b521c354725d4af38c127981f2c950c796a09b6e"}, - {file = "grpcio-1.65.5-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:078038e150a897e5e402ed3d57f1d31ebf604cbed80f595bd281b5da40762a92"}, - {file = "grpcio-1.65.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c97962720489ef31b5ad8a916e22bc31bba3664e063fb9f6702dce056d4aa61b"}, - {file = "grpcio-1.65.5-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:b8270b15b99781461b244f5c81d5c2bc9696ab9189fb5ff86c841417fb3b39fe"}, - {file = "grpcio-1.65.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8e5c4c15ac3fe1eb68e46bc51e66ad29be887479f231f8237cf8416058bf0cc1"}, - {file = "grpcio-1.65.5-cp312-cp312-win32.whl", hash = "sha256:f5b5970341359341d0e4c789da7568264b2a89cd976c05ea476036852b5950cd"}, - {file = "grpcio-1.65.5-cp312-cp312-win_amd64.whl", hash = "sha256:238a625f391a1b9f5f069bdc5930f4fd71b74426bea52196fc7b83f51fa97d34"}, - {file = "grpcio-1.65.5-cp38-cp38-linux_armv7l.whl", hash = "sha256:6c4e62bcf297a1568f627f39576dbfc27f1e5338a691c6dd5dd6b3979da51d1c"}, - {file = "grpcio-1.65.5-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:d7df567b67d16d4177835a68d3f767bbcbad04da9dfb52cbd19171f430c898bd"}, - {file = "grpcio-1.65.5-cp38-cp38-manylinux_2_17_aarch64.whl", hash = "sha256:b7ca419f1462390851eec395b2089aad1e49546b52d4e2c972ceb76da69b10f8"}, - {file = "grpcio-1.65.5-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fa36dd8496d3af0d40165252a669fa4f6fd2db4b4026b9a9411cbf060b9d6a15"}, - {file = "grpcio-1.65.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a101696f9ece90a0829988ff72f1b1ea2358f3df035bdf6d675dd8b60c2c0894"}, - {file = "grpcio-1.65.5-cp38-cp38-musllinux_1_1_i686.whl", hash = 
"sha256:2a6d8169812932feac514b420daffae8ab8e36f90f3122b94ae767e633296b17"}, - {file = "grpcio-1.65.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:47d0aaaab82823f0aa6adea5184350b46e2252e13a42a942db84da5b733f2e05"}, - {file = "grpcio-1.65.5-cp38-cp38-win32.whl", hash = "sha256:85ae8f8517d5bcc21fb07dbf791e94ed84cc28f84c903cdc2bd7eaeb437c8f45"}, - {file = "grpcio-1.65.5-cp38-cp38-win_amd64.whl", hash = "sha256:770bd4bd721961f6dd8049bc27338564ba8739913f77c0f381a9815e465ff965"}, - {file = "grpcio-1.65.5-cp39-cp39-linux_armv7l.whl", hash = "sha256:ab5ec837d8cee8dbce9ef6386125f119b231e4333cc6b6d57b6c5c7c82a72331"}, - {file = "grpcio-1.65.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:cabd706183ee08d8026a015af5819a0b3a8959bdc9d1f6fdacd1810f09200f2a"}, - {file = "grpcio-1.65.5-cp39-cp39-manylinux_2_17_aarch64.whl", hash = "sha256:ec71fc5b39821ad7d80db7473c8f8c2910f3382f0ddadfbcfc2c6c437107eb67"}, - {file = "grpcio-1.65.5-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d3a9e35bcb045e39d7cac30464c285389b9a816ac2067e4884ad2c02e709ef8e"}, - {file = "grpcio-1.65.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d750e9330eb14236ca11b78d0c494eed13d6a95eb55472298f0e547c165ee324"}, - {file = "grpcio-1.65.5-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:2b91ce647b6307f25650872454a4d02a2801f26a475f90d0b91ed8110baae589"}, - {file = "grpcio-1.65.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8da58ff80bc4556cf29bc03f5fff1f03b8387d6aaa7b852af9eb65b2cf833be4"}, - {file = "grpcio-1.65.5-cp39-cp39-win32.whl", hash = "sha256:7a412959aa5f08c5ac04aa7b7c3c041f5e4298cadd4fcc2acff195b56d185ebc"}, - {file = "grpcio-1.65.5-cp39-cp39-win_amd64.whl", hash = "sha256:55714ea852396ec9568f45f487639945ab674de83c12bea19d5ddbc3ae41ada3"}, - {file = "grpcio-1.65.5.tar.gz", hash = "sha256:ec6f219fb5d677a522b0deaf43cea6697b16f338cb68d009e30930c4aa0d2209"}, -] - -[package.extras] -protobuf = ["grpcio-tools (>=1.65.5)"] + {file = "grpcio-1.66.0-cp310-cp310-linux_armv7l.whl", hash = "sha256:ad7256f224437b2c29c2bef98ddd3130454c5b1ab1f0471fc11794cefd4dbd3d"}, + {file = "grpcio-1.66.0-cp310-cp310-macosx_12_0_universal2.whl", hash = "sha256:5f4b3357e59dfba9140a51597287297bc638710d6a163f99ee14efc19967a821"}, + {file = "grpcio-1.66.0-cp310-cp310-manylinux_2_17_aarch64.whl", hash = "sha256:e8d20308eeae15b3e182f47876f05acbdec1eebd9473a9814a44e46ec4a84c04"}, + {file = "grpcio-1.66.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1eb03524d0f55b965d6c86aa44e5db9e5eaa15f9ed3b164621e652e5b927f4b8"}, + {file = "grpcio-1.66.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:37514b68a42e9cf24536345d3cf9e580ffd29117c158b4eeea34625200256067"}, + {file = "grpcio-1.66.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:516fdbc8e156db71a004bc431a6303bca24cfde186babe96dde7bd01e8f0cc70"}, + {file = "grpcio-1.66.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d0439a970d65327de21c299ea0e0c2ad0987cdaf18ba5066621dea5f427f922b"}, + {file = "grpcio-1.66.0-cp310-cp310-win32.whl", hash = "sha256:5f93fc84b72bbc7b84a42f3ca9dc055fa00d2303d9803be011ebf7a10a4eb833"}, + {file = "grpcio-1.66.0-cp310-cp310-win_amd64.whl", hash = "sha256:8fc5c710ddd51b5a0dc36ef1b6663430aa620e0ce029b87b150dafd313b978c3"}, + {file = "grpcio-1.66.0-cp311-cp311-linux_armv7l.whl", hash = "sha256:dd614370e939f9fceeeb2915111a0795271b4c11dfb5fc0f58449bee40c726a5"}, + {file = "grpcio-1.66.0-cp311-cp311-macosx_10_9_universal2.whl", hash = 
"sha256:245b08f9b3c645a6a623f3ed4fa43dcfcd6ad701eb9c32511c1bb7380e8c3d23"}, + {file = "grpcio-1.66.0-cp311-cp311-manylinux_2_17_aarch64.whl", hash = "sha256:aaf30c75cbaf30e561ca45f21eb1f729f0fab3f15c592c1074795ed43e3ff96f"}, + {file = "grpcio-1.66.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:49234580a073ce7ac490112f6c67c874cbcb27804c4525978cdb21ba7f3f193c"}, + {file = "grpcio-1.66.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de9e20a0acb709dcfa15a622c91f584f12c9739a79c47999f73435d2b3cc8a3b"}, + {file = "grpcio-1.66.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:bc008c6afa1e7c8df99bd9154abc4f0470d26b7730ca2521122e99e771baa8c7"}, + {file = "grpcio-1.66.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:50cea8ce2552865b87e3dffbb85eb21e6b98d928621600c0feda2f02449cd837"}, + {file = "grpcio-1.66.0-cp311-cp311-win32.whl", hash = "sha256:508411df1f2b7cfa05d4d7dbf3d576fe4f949cd61c03f3a6f0378c84e3d7b963"}, + {file = "grpcio-1.66.0-cp311-cp311-win_amd64.whl", hash = "sha256:6d586a95c05c82a5354be48bb4537e1accaf2472d8eb7e9086d844cbff934482"}, + {file = "grpcio-1.66.0-cp312-cp312-linux_armv7l.whl", hash = "sha256:5ea27f4ce8c0daccfdd2c7961e6ba404b6599f47c948415c4cca5728739107a3"}, + {file = "grpcio-1.66.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:296a45ea835e12a1cc35ab0c57e455346c272af7b0d178e29c67742167262b4c"}, + {file = "grpcio-1.66.0-cp312-cp312-manylinux_2_17_aarch64.whl", hash = "sha256:e36fa838ac1d6c87198ca149cbfcc92e1af06bb8c8cd852622f8e58f33ea3324"}, + {file = "grpcio-1.66.0-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:684a4c07883cbd4ac864f0d08d927267404f5f0c76f31c85f9bbe05f2daae2f2"}, + {file = "grpcio-1.66.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3084e590e857ba7585ae91078e4c9b6ef55aaf1dc343ce26400ba59a146eada"}, + {file = "grpcio-1.66.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:526d4f6ca19f31b25606d5c470ecba55c0b22707b524e4de8987919e8920437d"}, + {file = "grpcio-1.66.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:423ae18637cd99ddcf2e5a6851c61828c49e9b9d022d0442d979b4f230109787"}, + {file = "grpcio-1.66.0-cp312-cp312-win32.whl", hash = "sha256:7bc9d823e05d63a87511fb456dcc48dc0fced86c282bf60229675e7ee7aac1a1"}, + {file = "grpcio-1.66.0-cp312-cp312-win_amd64.whl", hash = "sha256:230cdd696751e7eb1395718cd308234749daa217bb8d128f00357dc4df102558"}, + {file = "grpcio-1.66.0-cp38-cp38-linux_armv7l.whl", hash = "sha256:0f3010bf46b2a01c9e40644cb9ed91b4b8435e5c500a275da5f9f62580e31e80"}, + {file = "grpcio-1.66.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ba18cfdc09312eb2eea6fa0ce5d2eec3cf345ea78f6528b2eaed6432105e0bd0"}, + {file = "grpcio-1.66.0-cp38-cp38-manylinux_2_17_aarch64.whl", hash = "sha256:53d4c6706b49e358a2a33345dbe9b6b3bb047cecd7e8c07ba383bd09349bfef8"}, + {file = "grpcio-1.66.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:643d8d9632a688ae69661e924b862e23c83a3575b24e52917ec5bcc59543d212"}, + {file = "grpcio-1.66.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba60ae3b465b3e85080ae3bfbc36fd0305ae495ab16fcf8022fc7d7a23aac846"}, + {file = "grpcio-1.66.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:9d5251578767fe44602688c851c2373b5513048ac84c21a0fe946590a8e7933d"}, + {file = "grpcio-1.66.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5e8140b39f10d7be2263afa2838112de29374c5c740eb0afd99146cb5bdbd990"}, + {file = "grpcio-1.66.0-cp38-cp38-win32.whl", hash = 
"sha256:5b15ef1b296c4e78f15f64fc65bf8081f8774480ffcac45642f69d9d753d9c6b"}, + {file = "grpcio-1.66.0-cp38-cp38-win_amd64.whl", hash = "sha256:c072f90a1f0409f827ae86266984cba65e89c5831a0726b9fc7f4b5fb940b853"}, + {file = "grpcio-1.66.0-cp39-cp39-linux_armv7l.whl", hash = "sha256:a639d3866bfb5a678b5c0b92cd7ab543033ed8988854290fd86145e71731fd4c"}, + {file = "grpcio-1.66.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:6ed35bf7da3fb3b1949e32bdf47a8b5ffe0aed11722d948933bd068531cd4682"}, + {file = "grpcio-1.66.0-cp39-cp39-manylinux_2_17_aarch64.whl", hash = "sha256:1c5466222470cb7fbc9cc898af1d48eefd297cb2e2f59af6d4a851c862fa90ac"}, + {file = "grpcio-1.66.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:921b8f7f25d5300d7c6837a1e0639ef145fbdbfb728e0a5db2dbccc9fc0fd891"}, + {file = "grpcio-1.66.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3f6feb0dc8456d025e566709f7dd02885add99bedaac50229013069242a1bfd"}, + {file = "grpcio-1.66.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:748452dbd5a047475d5413bdef08b0b9ceb2c0c0e249d4ee905a5fb82c6328dc"}, + {file = "grpcio-1.66.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:832945e64176520520317b50d64ec7d79924429528d5747669b52d0bf2c7bd78"}, + {file = "grpcio-1.66.0-cp39-cp39-win32.whl", hash = "sha256:8096a922eb91bc97c839f675c3efa1257c6ef181ae1b25d3fb97f2cae4c57c01"}, + {file = "grpcio-1.66.0-cp39-cp39-win_amd64.whl", hash = "sha256:375b58892301a5fc6ca7d7ff689c9dc9d00895f5d560604ace9f4f0573013c63"}, + {file = "grpcio-1.66.0.tar.gz", hash = "sha256:c1ea4c528e7db6660718e4165fd1b5ac24b79a70c870a7bc0b7bdb9babab7c1e"}, +] + +[package.extras] +protobuf = ["grpcio-tools (>=1.66.0)"] [[package]] name = "grpcio-status" @@ -2370,13 +2298,13 @@ license = ["ukkonen"] [[package]] name = "idna" -version = "3.7" +version = "3.8" description = "Internationalized Domain Names in Applications (IDNA)" optional = false -python-versions = ">=3.5" +python-versions = ">=3.6" files = [ - {file = "idna-3.7-py3-none-any.whl", hash = "sha256:82fee1fc78add43492d3a1898bfa6d8a904cc97d8427f683ed8e798d07761aa0"}, - {file = "idna-3.7.tar.gz", hash = "sha256:028ff3aadf0609c1fd278d8ea3089299412a7a8b9bd005dd08b9f8285bcb5cfc"}, + {file = "idna-3.8-py3-none-any.whl", hash = "sha256:050b4e5baadcd44d760cedbd2b8e639f2ff89bbc7a5730fcc662954303377aac"}, + {file = "idna-3.8.tar.gz", hash = "sha256:d838c2c0ed6fced7693d5e8ab8e734d5f8fda53a039c0164afb0b82e771e3603"}, ] [[package]] @@ -2398,28 +2326,6 @@ doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linke perf = ["ipython"] test = ["flufl.flake8", "importlib-resources (>=1.3)", "jaraco.test (>=5.4)", "packaging", "pyfakefs", "pytest (>=6,!=8.1.*)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy", "pytest-perf (>=0.9.2)", "pytest-ruff (>=0.2.1)"] -[[package]] -name = "importlib-resources" -version = "6.4.4" -description = "Read resources from Python packages" -optional = false -python-versions = ">=3.8" -files = [ - {file = "importlib_resources-6.4.4-py3-none-any.whl", hash = "sha256:dda242603d1c9cd836c3368b1174ed74cb4049ecd209e7a1a0104620c18c5c11"}, - {file = "importlib_resources-6.4.4.tar.gz", hash = "sha256:20600c8b7361938dc0bb2d5ec0297802e575df486f5a544fa414da65e13721f7"}, -] - -[package.dependencies] -zipp = {version = ">=3.1.0", markers = "python_version < \"3.10\""} - -[package.extras] -check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1)"] -cover = ["pytest-cov"] -doc = ["furo", "jaraco.packaging (>=9.3)", 
"jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] -enabler = ["pytest-enabler (>=2.2)"] -test = ["jaraco.test (>=5.4)", "pytest (>=6,!=8.1.*)", "zipp (>=3.17)"] -type = ["pytest-mypy"] - [[package]] name = "inflect" version = "7.3.1" @@ -2515,13 +2421,13 @@ test = ["flaky", "ipyparallel", "pre-commit", "pytest (>=7.0)", "pytest-asyncio [[package]] name = "ipython" -version = "8.18.1" +version = "8.26.0" description = "IPython: Productive Interactive Computing" optional = false -python-versions = ">=3.9" +python-versions = ">=3.10" files = [ - {file = "ipython-8.18.1-py3-none-any.whl", hash = "sha256:e8267419d72d81955ec1177f8a29aaa90ac80ad647499201119e2f05e99aa397"}, - {file = "ipython-8.18.1.tar.gz", hash = "sha256:ca6f079bb33457c66e233e4580ebfc4128855b4cf6370dddd73842a9563e8a27"}, + {file = "ipython-8.26.0-py3-none-any.whl", hash = "sha256:e6b347c27bdf9c32ee9d31ae85defc525755a1869f14057e900675b9e8d6e6ff"}, + {file = "ipython-8.26.0.tar.gz", hash = "sha256:1cec0fbba8404af13facebe83d04436a7434c7400e59f47acf467c64abd0956c"}, ] [package.dependencies] @@ -2530,43 +2436,44 @@ decorator = "*" exceptiongroup = {version = "*", markers = "python_version < \"3.11\""} jedi = ">=0.16" matplotlib-inline = "*" -pexpect = {version = ">4.3", markers = "sys_platform != \"win32\""} +pexpect = {version = ">4.3", markers = "sys_platform != \"win32\" and sys_platform != \"emscripten\""} prompt-toolkit = ">=3.0.41,<3.1.0" pygments = ">=2.4.0" stack-data = "*" -traitlets = ">=5" -typing-extensions = {version = "*", markers = "python_version < \"3.10\""} +traitlets = ">=5.13.0" +typing-extensions = {version = ">=4.6", markers = "python_version < \"3.12\""} [package.extras] -all = ["black", "curio", "docrepr", "exceptiongroup", "ipykernel", "ipyparallel", "ipywidgets", "matplotlib", "matplotlib (!=3.2.0)", "nbconvert", "nbformat", "notebook", "numpy (>=1.22)", "pandas", "pickleshare", "pytest (<7)", "pytest (<7.1)", "pytest-asyncio (<0.22)", "qtconsole", "setuptools (>=18.5)", "sphinx (>=1.3)", "sphinx-rtd-theme", "stack-data", "testpath", "trio", "typing-extensions"] +all = ["ipython[black,doc,kernel,matplotlib,nbconvert,nbformat,notebook,parallel,qtconsole]", "ipython[test,test-extra]"] black = ["black"] -doc = ["docrepr", "exceptiongroup", "ipykernel", "matplotlib", "pickleshare", "pytest (<7)", "pytest (<7.1)", "pytest-asyncio (<0.22)", "setuptools (>=18.5)", "sphinx (>=1.3)", "sphinx-rtd-theme", "stack-data", "testpath", "typing-extensions"] +doc = ["docrepr", "exceptiongroup", "intersphinx-registry", "ipykernel", "ipython[test]", "matplotlib", "setuptools (>=18.5)", "sphinx (>=1.3)", "sphinx-rtd-theme", "sphinxcontrib-jquery", "tomli", "typing-extensions"] kernel = ["ipykernel"] +matplotlib = ["matplotlib"] nbconvert = ["nbconvert"] nbformat = ["nbformat"] notebook = ["ipywidgets", "notebook"] parallel = ["ipyparallel"] qtconsole = ["qtconsole"] -test = ["pickleshare", "pytest (<7.1)", "pytest-asyncio (<0.22)", "testpath"] -test-extra = ["curio", "matplotlib (!=3.2.0)", "nbformat", "numpy (>=1.22)", "pandas", "pickleshare", "pytest (<7.1)", "pytest-asyncio (<0.22)", "testpath", "trio"] +test = ["packaging", "pickleshare", "pytest", "pytest-asyncio (<0.22)", "testpath"] +test-extra = ["curio", "ipython[test]", "matplotlib (!=3.2.0)", "nbformat", "numpy (>=1.23)", "pandas", "trio"] [[package]] name = "ipywidgets" -version = "8.1.3" +version = "8.1.5" description = "Jupyter interactive widgets" optional = false python-versions = ">=3.7" files = [ - {file = 
"ipywidgets-8.1.3-py3-none-any.whl", hash = "sha256:efafd18f7a142248f7cb0ba890a68b96abd4d6e88ddbda483c9130d12667eaf2"}, - {file = "ipywidgets-8.1.3.tar.gz", hash = "sha256:f5f9eeaae082b1823ce9eac2575272952f40d748893972956dc09700a6392d9c"}, + {file = "ipywidgets-8.1.5-py3-none-any.whl", hash = "sha256:3290f526f87ae6e77655555baba4f36681c555b8bdbbff430b70e52c34c86245"}, + {file = "ipywidgets-8.1.5.tar.gz", hash = "sha256:870e43b1a35656a80c18c9503bbf2d16802db1cb487eec6fab27d683381dde17"}, ] [package.dependencies] comm = ">=0.1.3" ipython = ">=6.1.0" -jupyterlab-widgets = ">=3.0.11,<3.1.0" +jupyterlab-widgets = ">=3.0.12,<3.1.0" traitlets = ">=4.3.1" -widgetsnbextension = ">=4.0.11,<4.1.0" +widgetsnbextension = ">=4.0.12,<4.1.0" [package.extras] test = ["ipykernel", "jsonschema", "pytest (>=3.6.0)", "pytest-cov", "pytz"] @@ -2721,7 +2628,6 @@ files = [ ] [package.dependencies] -importlib-metadata = {version = ">=4.8.3", markers = "python_version < \"3.10\""} jupyter-core = ">=4.12,<5.0.dev0 || >=5.1.dev0" python-dateutil = ">=2.8.2" pyzmq = ">=23.0" @@ -2754,29 +2660,15 @@ test = ["ipykernel", "pre-commit", "pytest (<8)", "pytest-cov", "pytest-timeout" [[package]] name = "jupyterlab-widgets" -version = "3.0.11" +version = "3.0.13" description = "Jupyter interactive widgets for JupyterLab" optional = false python-versions = ">=3.7" files = [ - {file = "jupyterlab_widgets-3.0.11-py3-none-any.whl", hash = "sha256:78287fd86d20744ace330a61625024cf5521e1c012a352ddc0a3cdc2348becd0"}, - {file = "jupyterlab_widgets-3.0.11.tar.gz", hash = "sha256:dd5ac679593c969af29c9bed054c24f26842baa51352114736756bc035deee27"}, -] - -[[package]] -name = "keyboard" -version = "0.13.5" -description = "Hook and simulate keyboard events on Windows and Linux" -optional = false -python-versions = "*" -files = [ - {file = "keyboard-0.13.5-py3-none-any.whl", hash = "sha256:8e9c2422f1217e0bd84489b9ecd361027cc78415828f4fe4f88dd4acd587947b"}, - {file = "keyboard-0.13.5.zip", hash = "sha256:63ed83305955939ca5c9a73755e5cc43e8242263f5ad5fd3bb7e0b032f3d308b"}, + {file = "jupyterlab_widgets-3.0.13-py3-none-any.whl", hash = "sha256:e3cda2c233ce144192f1e29914ad522b2f4c40e77214b0cc97377ca3d323db54"}, + {file = "jupyterlab_widgets-3.0.13.tar.gz", hash = "sha256:a2966d385328c1942b683a8cd96b89b8dd82c8b8f81dda902bb2bc06d46f5bed"}, ] -[package.dependencies] -pyobjc = {version = "*", markers = "sys_platform == \"darwin\""} - [[package]] name = "kiwisolver" version = "1.4.5" @@ -2976,26 +2868,6 @@ display = ["matplotlib (>=3.5.0)"] docs = ["ipython (>=7.0)", "matplotlib (>=3.5.0)", "mir-eval (>=0.5)", "numba (>=0.51)", "numpydoc", "presets", "sphinx (!=1.3.1)", "sphinx-copybutton (>=0.5.2)", "sphinx-gallery (>=0.7)", "sphinx-multiversion (>=0.2.3)", "sphinx-rtd-theme (>=1.2.0)", "sphinxcontrib-svg2pdfconverter"] tests = ["matplotlib (>=3.5.0)", "packaging (>=20.0)", "pytest", "pytest-cov", "pytest-mpl", "resampy (>=0.2.2)", "samplerate", "types-decorator"] -[[package]] -name = "linkify-it-py" -version = "2.0.3" -description = "Links recognition library with FULL unicode support." 
-optional = false -python-versions = ">=3.7" -files = [ - {file = "linkify-it-py-2.0.3.tar.gz", hash = "sha256:68cda27e162e9215c17d786649d1da0021a451bdc436ef9e0fa0ba5234b9b048"}, - {file = "linkify_it_py-2.0.3-py3-none-any.whl", hash = "sha256:6bcbc417b0ac14323382aef5c5192c0075bf8a9d6b41820a2b66371eac6b6d79"}, -] - -[package.dependencies] -uc-micro-py = "*" - -[package.extras] -benchmark = ["pytest", "pytest-benchmark"] -dev = ["black", "flake8", "isort", "pre-commit", "pyproject-flake8"] -doc = ["myst-parser", "sphinx", "sphinx-book-theme"] -test = ["coverage", "pytest", "pytest-cov"] - [[package]] name = "litellm" version = "1.43.1" @@ -3045,13 +2917,12 @@ types-protobuf = ">=3" [[package]] name = "livekit-agents" -version = "0.8.6" +version = "0.8.7" description = "LiveKit Python Agents" optional = false python-versions = ">=3.9.0" files = [ - {file = "livekit_agents-0.8.6-py3-none-any.whl", hash = "sha256:398d0d0a80696287c47f41bff2c5e58a176f8c6b63bd36a3887b0bd75043401f"}, - {file = "livekit_agents-0.8.6.tar.gz", hash = "sha256:303192259a6e911b19002a041fdbec7e96cfa7055126b668d49e9420b87fa010"}, + {file = "livekit_agents-0.8.7-py3-none-any.whl", hash = "sha256:889e75c245acdc75ff8345f7c23b466d83217018df76e29ba05b98de82ed4b87"}, ] [package.dependencies] @@ -3324,9 +3195,6 @@ files = [ {file = "markdown-3.7.tar.gz", hash = "sha256:2ae2471477cfd02dbbf038d5d9bc226d40def84b4fe2986e49b59b6b472bbed2"}, ] -[package.dependencies] -importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""} - [package.extras] docs = ["mdx-gh-links (>=0.2)", "mkdocs (>=1.5)", "mkdocs-gen-files", "mkdocs-literate-nav", "mkdocs-nature (>=0.6)", "mkdocs-section-index", "mkdocstrings[python]"] testing = ["coverage", "pyyaml"] @@ -3343,8 +3211,6 @@ files = [ ] [package.dependencies] -linkify-it-py = {version = ">=1,<3", optional = true, markers = "extra == \"linkify\""} -mdit-py-plugins = {version = "*", optional = true, markers = "extra == \"plugins\""} mdurl = ">=0.1,<1.0" [package.extras] @@ -3479,7 +3345,6 @@ files = [ contourpy = ">=1.0.1" cycler = ">=0.10" fonttools = ">=4.22.0" -importlib-resources = {version = ">=3.2.0", markers = "python_version < \"3.10\""} kiwisolver = ">=1.3.1" numpy = ">=1.23" packaging = ">=20.0" @@ -3504,25 +3369,6 @@ files = [ [package.dependencies] traitlets = "*" -[[package]] -name = "mdit-py-plugins" -version = "0.4.1" -description = "Collection of plugins for markdown-it-py" -optional = false -python-versions = ">=3.8" -files = [ - {file = "mdit_py_plugins-0.4.1-py3-none-any.whl", hash = "sha256:1020dfe4e6bfc2c79fb49ae4e3f5b297f5ccd20f010187acc52af2921e27dc6a"}, - {file = "mdit_py_plugins-0.4.1.tar.gz", hash = "sha256:834b8ac23d1cd60cec703646ffd22ae97b7955a6d596eb1d304be1e251ae499c"}, -] - -[package.dependencies] -markdown-it-py = ">=1.0.0,<4.0.0" - -[package.extras] -code-style = ["pre-commit"] -rtd = ["myst-parser", "sphinx-book-theme"] -testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"] - [[package]] name = "mdurl" version = "0.1.2" @@ -3595,21 +3441,6 @@ docs = ["sphinx"] gmpy = ["gmpy2 (>=2.1.0a4)"] tests = ["pytest (>=4.6)"] -[[package]] -name = "mpv" -version = "1.0.7" -description = "A python interface to the mpv media player" -optional = false -python-versions = ">=3.9" -files = [ - {file = "mpv-1.0.7-py3-none-any.whl", hash = "sha256:520fb134c18185b69c7fce4aa3514f14371028022d92eb193818e9fefb1e9fe8"}, - {file = "mpv-1.0.7.tar.gz", hash = "sha256:ae17d56176e05e4d046aa28a0732a478c0d58603e878e8da6d82b6c145ae1d82"}, -] - -[package.extras] 
-screenshot-raw = ["Pillow"] -test = ["PyVirtualDisplay"] - [[package]] name = "msgpack" version = "1.0.8" @@ -4194,51 +4025,53 @@ name = "open-interpreter" version = "0.3.8" description = "Let language models run code" optional = false -python-versions = ">=3.9,<4" -files = [] -develop = false - -[package.dependencies] -astor = "^0.8.1" -fastapi = {version = "^0.111.0", optional = true} -git-python = "^1.0.3" -google-generativeai = "^0.7.1" -html2image = "^2.0.4.3" -inquirer = "^3.1.3" -ipykernel = "^6.26.0" -ipywidgets = {version = "^8.1.2", optional = true} -janus = {version = "^1.0.0", optional = true} -jupyter-client = "^8.6.0" -litellm = "^1.41.26" -matplotlib = "^3.8.2" -opencv-python = {version = "^4.8.1.78", optional = true} -platformdirs = "^4.2.0" -plyer = {version = "^2.1.0", optional = true} -psutil = "^5.9.6" -pyautogui = {version = "^0.9.54", optional = true} -pydantic = "^2.6.4" -pyperclip = "^1.9.0" -pyreadline3 = {version = "^3.4.1", markers = "sys_platform == \"win32\""} -pytesseract = {version = "^0.3.10", optional = true} -pywinctl = {version = "^0.3", optional = true} -pyyaml = "^6.0.1" -rich = "^13.4.2" -screeninfo = {version = "^0.8.1", optional = true} -send2trash = "^1.8.2" -sentence-transformers = {version = "^2.5.1", optional = true} +python-versions = "<4,>=3.9" +files = [ + {file = "open_interpreter-0.3.8-py3-none-any.whl", hash = "sha256:facffdda4fe20718e902c9c3b53e30d5b15ea1a18df3c5a317f33f9f1d3fe859"}, + {file = "open_interpreter-0.3.8.tar.gz", hash = "sha256:9d336018b0111a2ad1e2ec163dac34bfc20327e8609d51575080c2d68b1e0209"}, +] + +[package.dependencies] +astor = ">=0.8.1,<0.9.0" +fastapi = {version = ">=0.111.0,<0.112.0", optional = true, markers = "extra == \"server\""} +git-python = ">=1.0.3,<2.0.0" +google-generativeai = ">=0.7.1,<0.8.0" +html2image = ">=2.0.4.3,<3.0.0.0" +inquirer = ">=3.1.3,<4.0.0" +ipykernel = ">=6.26.0,<7.0.0" +ipywidgets = {version = ">=8.1.2,<9.0.0", optional = true, markers = "extra == \"os\""} +janus = {version = ">=1.0.0,<2.0.0", optional = true, markers = "extra == \"server\""} +jupyter-client = ">=8.6.0,<9.0.0" +litellm = ">=1.41.26,<2.0.0" +matplotlib = ">=3.8.2,<4.0.0" +nltk = ">=3.8.1,<4.0.0" +opencv-python = {version = ">=4.8.1.78,<5.0.0.0", optional = true, markers = "extra == \"os\" or extra == \"local\""} +platformdirs = ">=4.2.0,<5.0.0" +plyer = {version = ">=2.1.0,<3.0.0", optional = true, markers = "extra == \"os\""} +psutil = ">=5.9.6,<6.0.0" +pyautogui = {version = ">=0.9.54,<0.10.0", optional = true, markers = "extra == \"os\""} +pydantic = ">=2.6.4,<3.0.0" +pyperclip = ">=1.9.0,<2.0.0" +pyreadline3 = {version = ">=3.4.1,<4.0.0", markers = "sys_platform == \"win32\""} +pytesseract = {version = ">=0.3.10,<0.4.0", optional = true, markers = "extra == \"os\" or extra == \"local\""} +pywinctl = {version = ">=0.3,<0.4", optional = true, markers = "extra == \"os\""} +pyyaml = ">=6.0.1,<7.0.0" +rich = ">=13.4.2,<14.0.0" +screeninfo = {version = ">=0.8.1,<0.9.0", optional = true, markers = "extra == \"os\""} +send2trash = ">=1.8.2,<2.0.0" +sentence-transformers = {version = ">=2.5.1,<3.0.0", optional = true, markers = "extra == \"os\""} setuptools = "*" -shortuuid = "^1.0.13" -six = "^1.16.0" -starlette = "^0.37.2" -tiktoken = "^0.7.0" -timm = {version = "^0.9.16", optional = true} -tokentrim = "^0.1.13" -toml = "^0.10.2" -torch = {version = "^2.2.1", optional = true} -typer = "^0.12.4" -uvicorn = {version = "^0.30.1", optional = true} -wget = "^3.2" -yaspin = "^3.0.2" +shortuuid = ">=1.0.13,<2.0.0" +six = 
">=1.16.0,<2.0.0" +starlette = ">=0.37.2,<0.38.0" +tiktoken = ">=0.7.0,<0.8.0" +timm = {version = ">=0.9.16,<0.10.0", optional = true, markers = "extra == \"os\""} +tokentrim = ">=0.1.13,<0.2.0" +toml = ">=0.10.2,<0.11.0" +torch = {version = ">=2.2.1,<3.0.0", optional = true, markers = "extra == \"os\" or extra == \"local\""} +uvicorn = {version = ">=0.30.1,<0.31.0", optional = true, markers = "extra == \"server\""} +wget = ">=3.2,<4.0" +yaspin = ">=3.0.2,<4.0.0" [package.extras] local = ["easyocr (>=1.7.1,<2.0.0)", "einops (>=0.8.0,<0.9.0)", "opencv-python (>=4.8.1.78,<5.0.0.0)", "pytesseract (>=0.3.10,<0.4.0)", "torch (>=2.2.1,<3.0.0)", "torchvision (>=0.18.0,<0.19.0)", "transformers (==4.41.2)"] @@ -4246,12 +4079,6 @@ os = ["ipywidgets (>=8.1.2,<9.0.0)", "opencv-python (>=4.8.1.78,<5.0.0.0)", "ply safe = ["semgrep (>=1.52.0,<2.0.0)"] server = ["fastapi (>=0.111.0,<0.112.0)", "janus (>=1.0.0,<2.0.0)", "uvicorn (>=0.30.1,<0.31.0)"] -[package.source] -type = "git" -url = "https://github.com/OpenInterpreter/open-interpreter.git" -reference = "development" -resolved_reference = "34fd952e9cdc520b27a3523aa066dae06c3fc82e" - [[package]] name = "openai" version = "1.36.1" @@ -4294,12 +4121,33 @@ files = [ [package.dependencies] numpy = [ {version = ">=1.23.5", markers = "python_version >= \"3.11\""}, - {version = ">=1.21.0", markers = "python_version == \"3.9\" and platform_system == \"Darwin\" and platform_machine == \"arm64\""}, {version = ">=1.21.4", markers = "python_version >= \"3.10\" and platform_system == \"Darwin\" and python_version < \"3.11\""}, {version = ">=1.21.2", markers = "platform_system != \"Darwin\" and python_version >= \"3.10\" and python_version < \"3.11\""}, - {version = ">=1.19.3", markers = "platform_system == \"Linux\" and platform_machine == \"aarch64\" and python_version >= \"3.8\" and python_version < \"3.10\" or python_version > \"3.9\" and python_version < \"3.10\" or python_version >= \"3.9\" and platform_system != \"Darwin\" and python_version < \"3.10\" or python_version >= \"3.9\" and platform_machine != \"arm64\" and python_version < \"3.10\""}, ] +[[package]] +name = "openwakeword" +version = "0.6.0" +description = "An open-source audio wake word (or phrase) detection framework with a focus on performance and simplicity" +optional = false +python-versions = ">=3.7" +files = [ + {file = "openwakeword-0.6.0-py3-none-any.whl", hash = "sha256:6f423a4e3ae9dd0e3cd12b50ff8abf69679f687b4ab349d7c82c021c0e2abc9d"}, + {file = "openwakeword-0.6.0.tar.gz", hash = "sha256:36858d90f1183e307485597a912a4e3c3384b14ea9923f83feaffae7c1565565"}, +] + +[package.dependencies] +onnxruntime = ">=1.10.0,<2" +requests = ">=2.0,<3" +scikit-learn = ">=1,<2" +scipy = ">=1.3,<2" +tflite-runtime = {version = ">=2.8.0,<3", markers = "platform_system == \"Linux\""} +tqdm = ">=4.0,<5.0" + +[package.extras] +full = ["acoustics (>=0.2.6,<1)", "audiomentations (>=0.30.0,<1)", "datasets (>=2.14.4,<3)", "deep-phonemizer (==0.0.19)", "mutagen (>=1.46.0,<2)", "onnx (==1.14.0)", "onnx-tf (==1.10.0)", "pronouncing (>=0.2.0,<1)", "protobuf (>=3.20,<4)", "pytest (>=7.2.0,<8)", "pytest-cov (>=2.10.1,<3)", "pytest-flake8 (>=1.1.1,<2)", "pytest-mypy (>=0.10.0,<1)", "pyyaml (>=6.0,<7)", "speechbrain (>=0.5.14,<1)", "tensorflow-cpu (==2.8.1)", "tensorflow-probability (==0.16.0)", "torch (>=1.13.1,<3)", "torch-audiomentations (>=0.11.0,<1)", "torchaudio (>=0.13.1,<1)", "torchinfo (>=1.8.0,<2)", "torchmetrics (>=0.11.4,<1)", "tqdm (>=4.64.0,<5)"] +test = ["flake8 (>=4.0,<4.1)", "mock (>=5.1,<6)", "pytest 
(>=7.2.0,<8)", "pytest-cov (>=2.10.1,<3)", "pytest-flake8 (>=1.1.1,<2)", "pytest-mypy (>=0.10.0,<1)", "types-PyYAML", "types-mock (>=5.1,<6)", "types-requests", "types-requests (>=2.0,<3)"] + [[package]] name = "packaging" version = "24.1" @@ -4621,16 +4469,6 @@ files = [ {file = "protobuf-4.25.4.tar.gz", hash = "sha256:0dc4a62cc4052a036ee2204d26fe4d835c62827c855c8a03f29fe6da146b380d"}, ] -[[package]] -name = "proxy-tools" -version = "0.1.0" -description = "Proxy Implementation" -optional = false -python-versions = "*" -files = [ - {file = "proxy_tools-0.1.0.tar.gz", hash = "sha256:ccb3751f529c047e2d8a58440d86b205303cf0fe8146f784d1cbcd94f0a28010"}, -] - [[package]] name = "psutil" version = "5.9.8" @@ -4975,72 +4813,6 @@ files = [ {file = "pydub-0.25.1.tar.gz", hash = "sha256:980a33ce9949cab2a569606b65674d748ecbca4f0796887fd6f46173a7b0d30f"}, ] -[[package]] -name = "pygame" -version = "2.6.0" -description = "Python Game Development" -optional = false -python-versions = ">=3.6" -files = [ - {file = "pygame-2.6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e5707aa9d029752495b3eddc1edff62e0e390a02f699b0f1ce77fe0b8c70ea4f"}, - {file = "pygame-2.6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d3ed0547368733b854c0d9981c982a3cdfabfa01b477d095c57bf47f2199da44"}, - {file = "pygame-2.6.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6050f3e95f1f16602153d616b52619c6a2041cee7040eb529f65689e9633fc3e"}, - {file = "pygame-2.6.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:89be55b7e9e22e0eea08af9d6cfb97aed5da780f0b3a035803437d481a16d972"}, - {file = "pygame-2.6.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d65fb222eea1294cfc8206d9e5754d476a1673eb2783c03c4f70e0455320274"}, - {file = "pygame-2.6.0-cp310-cp310-win32.whl", hash = "sha256:71eebb9803cb350298de188fb7cdd3ebf13299f78d59a71c7e81efc649aae348"}, - {file = "pygame-2.6.0-cp310-cp310-win_amd64.whl", hash = "sha256:1551852a2cd5b4139a752888f6cbeeb4a96fc0fe6e6f3f8b9d9784eb8fceab13"}, - {file = "pygame-2.6.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:f6e5e6c010b1bf429388acf4d41d7ab2f7ad8fbf241d0db822102d35c9a2eb84"}, - {file = "pygame-2.6.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:99902f4a2f6a338057200d99b5120a600c27a9f629ca012a9b0087c045508d08"}, - {file = "pygame-2.6.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6a284664978a1989c1e31a0888b2f70cfbcbafdfa3bb310e750b0d3366416225"}, - {file = "pygame-2.6.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:829623cee298b3dbaa1dd9f52c3051ae82f04cad7708c8c67cb9a1a4b8fd3c0b"}, - {file = "pygame-2.6.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6acf7949ed764487d51123f4f3606e8f76b0df167fef12ef73ef423c35fdea39"}, - {file = "pygame-2.6.0-cp311-cp311-win32.whl", hash = "sha256:3f809560c99bd1fb4716610eca0cd36412528f03da1a63841a347b71d0c604ee"}, - {file = "pygame-2.6.0-cp311-cp311-win_amd64.whl", hash = "sha256:6897ab87f9193510a774a3483e00debfe166f340ca159f544ef99807e2a44ec4"}, - {file = "pygame-2.6.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:b834711ebc8b9d0c2a5f9bfae4403dd277b2c61bcb689e1aa630d01a1ebcf40a"}, - {file = "pygame-2.6.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b5ac288655e8a31a303cc286e79cc57979ed2ba19c3a14042d4b6391c1d3bed2"}, - {file = "pygame-2.6.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:d666667b7826b0a7921b8ce0a282ba5281dfa106976c1a3b24e32a0af65ad3b1"}, - {file = "pygame-2.6.0-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fd8848a37a7cee37854c7efb8d451334477c9f8ce7ac339c079e724dc1334a76"}, - {file = "pygame-2.6.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:315e7b3c1c573984f549ac5da9778ac4709b3b4e3a4061050d94eab63fa4fe31"}, - {file = "pygame-2.6.0-cp312-cp312-win32.whl", hash = "sha256:e44bde0840cc21a91c9d368846ac538d106cf0668be1a6030f48df139609d1e8"}, - {file = "pygame-2.6.0-cp312-cp312-win_amd64.whl", hash = "sha256:1c429824b1f881a7a5ce3b5c2014d3d182aa45a22cea33c8347a3971a5446907"}, - {file = "pygame-2.6.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:b832200bd8b6fc485e087bf3ef7ec1a21437258536413a5386088f5dcd3a9870"}, - {file = "pygame-2.6.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:098029d01a46ea4e30620dfb7c28a577070b456c8fc96350dde05f85c0bf51b5"}, - {file = "pygame-2.6.0-cp36-cp36m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a858bbdeac5ec473ec9e726c55fb8fbdc2f4aad7c55110e899883738071c7c9b"}, - {file = "pygame-2.6.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6f908762941fd99e1f66d1211d26383184f6045c45673443138b214bf48a89aa"}, - {file = "pygame-2.6.0-cp36-cp36m-win32.whl", hash = "sha256:4a63daee99d050f47d6ec7fa7dbd1c6597b8f082cdd58b6918d382d2bc31262d"}, - {file = "pygame-2.6.0-cp36-cp36m-win_amd64.whl", hash = "sha256:ace471b3849d68968e5427fc01166ef5afaf552a5c442fc2c28d3b7226786f55"}, - {file = "pygame-2.6.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:fea019713d0c89dfd5909225aa933010100035d1cd30e6c936e8b6f00529fb80"}, - {file = "pygame-2.6.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:249dbf2d51d9f0266009a380ccf0532e1a57614a1528bb2f89a802b01d61f93e"}, - {file = "pygame-2.6.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8cb51533ee3204e8160600b0de34eaad70eb913a182c94a7777b6051e8fc52f1"}, - {file = "pygame-2.6.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f637636a44712e94e5601ec69160a080214626471983dfb0b5b68aa0c61563d"}, - {file = "pygame-2.6.0-cp37-cp37m-win32.whl", hash = "sha256:e432156b6f346f4cc6cab03ce9657600093390f4c9b10bf458716b25beebfe33"}, - {file = "pygame-2.6.0-cp37-cp37m-win_amd64.whl", hash = "sha256:a0194652db7874bdde7dfc69d659ca954544c012e04ae527151325bfb970f423"}, - {file = "pygame-2.6.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:eae3ee62cc172e268121d5bd9dc406a67094d33517de3a91de3323d6ae23eb02"}, - {file = "pygame-2.6.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f6a58b0a5a8740a3c2cf6fc5366888bd4514561253437f093c12a9ab4fb3ecae"}, - {file = "pygame-2.6.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c71da36997dc7b9b4ee973fa3a5d4a6cfb2149161b5b1c08b712d2f13a63ccfe"}, - {file = "pygame-2.6.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b86771801a7fc10d9a62218f27f1d5c13341c3a27394aa25578443a9cd199830"}, - {file = "pygame-2.6.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4928f3acf5a9ce5fbab384c21f1245304535ffd5fb167ae92a6b4d3cdb55a3b6"}, - {file = "pygame-2.6.0-cp38-cp38-win32.whl", hash = "sha256:4faab2df9926c4d31215986536b112f0d76f711cf02f395805f1ff5df8fd55fc"}, - {file = "pygame-2.6.0-cp38-cp38-win_amd64.whl", hash = "sha256:afbb8d97aed93dfb116fe105603dacb68f8dab05b978a40a9e4ab1b6c1f683fd"}, - {file = 
"pygame-2.6.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:d11f3646b53819892f4a731e80b8589a9140343d0d4b86b826802191b241228c"}, - {file = "pygame-2.6.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:5ef92ed93c354eabff4b85e457d4d6980115004ec7ff52a19fd38b929c3b80fb"}, - {file = "pygame-2.6.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9bc1795f2e36302882546faacd5a0191463c4f4ae2b90e7c334a7733aa4190d2"}, - {file = "pygame-2.6.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e92294fcc85c4955fe5bc6a0404e4cc870808005dc8f359e881544e3cc214108"}, - {file = "pygame-2.6.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0cb7bdf3ee0233a3ac02ef777c01dfe315e6d4670f1312c83b91c1ef124359a"}, - {file = "pygame-2.6.0-cp39-cp39-win32.whl", hash = "sha256:ac906478ae489bb837bf6d2ae1eb9261d658aa2c34fa5b283027a04149bda81a"}, - {file = "pygame-2.6.0-cp39-cp39-win_amd64.whl", hash = "sha256:92cf12a9722f6f0bdc5520d8925a8f085cff9c054a2ea462fc409cba3781be27"}, - {file = "pygame-2.6.0-pp36-pypy36_pp73-win32.whl", hash = "sha256:a6636f452fdaddf604a060849feb84c056930b6a3c036214f607741f16aac942"}, - {file = "pygame-2.6.0-pp37-pypy37_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3dc242dc15d067d10f25c5b12a1da48ca9436d8e2d72353eaf757e83612fba2f"}, - {file = "pygame-2.6.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f82df23598a281c8c342d3c90be213c8fe762a26c15815511f60d0aac6e03a70"}, - {file = "pygame-2.6.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2ed2539bb6bd211fc570b1169dc4a64a74ec5cd95741e62a0ab46bd18fe08e0d"}, - {file = "pygame-2.6.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:904aaf29710c6b03a7e1a65b198f5467ed6525e8e60bdcc5e90ff8584c1d54ea"}, - {file = "pygame-2.6.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fcd28f96f0fffd28e71a98773843074597e10d7f55a098e2e5bcb2bef1bdcbf5"}, - {file = "pygame-2.6.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4fad1ab33443ecd4f958dbbb67fc09fcdc7a37e26c34054e3296fb7e26ad641e"}, - {file = "pygame-2.6.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e909186d4d512add39b662904f0f79b73028fbfc4fbfdaf6f9412aed4e500e9c"}, - {file = "pygame-2.6.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:79abcbf6d12fce51a955a0652ccd50b6d0a355baa27799535eaf21efb43433dd"}, - {file = "pygame-2.6.0.tar.gz", hash = "sha256:722d33ae676aa8533c1f955eded966411298831346b8d51a77dad22e46ba3e35"}, -] - [[package]] name = "pygetwindow" version = "0.0.9" @@ -8002,13 +7774,13 @@ pyobjc-framework-Cocoa = ">=10.3.1" [[package]] name = "pyparsing" -version = "3.1.2" +version = "3.1.4" description = "pyparsing module - Classes and methods to define and execute parsing grammars" optional = false python-versions = ">=3.6.8" files = [ - {file = "pyparsing-3.1.2-py3-none-any.whl", hash = "sha256:f9db75911801ed778fe61bb643079ff86601aca99fcae6345aa67292038fb742"}, - {file = "pyparsing-3.1.2.tar.gz", hash = "sha256:a1bac0ce561155ecc3ed78ca94d3c9378656ad4c94c1270de543f621420f94ad"}, + {file = "pyparsing-3.1.4-py3-none-any.whl", hash = "sha256:a6a7ee4235a3f944aa1fa2249307708f893fe5717dc603503c6c7969c070fb7c"}, + {file = "pyparsing-3.1.4.tar.gz", hash = "sha256:f86ec8d1a83f11977c9a6ea7598e8c27fc5cddfa5b07ea2241edbbde1d7bc032"}, ] [package.extras] @@ -8038,20 +7810,6 @@ files = [ [package.dependencies] pywin32 = ">=223" -[[package]] -name = 
"pyqrcode" -version = "1.2.1" -description = "A QR code generator written purely in Python with SVG, EPS, PNG and terminal output." -optional = false -python-versions = "*" -files = [ - {file = "PyQRCode-1.2.1.tar.gz", hash = "sha256:fdbf7634733e56b72e27f9bce46e4550b75a3a2c420414035cae9d9d26b234d5"}, - {file = "PyQRCode-1.2.1.zip", hash = "sha256:1b2812775fa6ff5c527977c4cd2ccb07051ca7d0bc0aecf937a43864abe5eff6"}, -] - -[package.extras] -png = ["pypng (>=0.0.13)"] - [[package]] name = "pyreadline3" version = "3.4.1" @@ -8096,7 +7854,7 @@ files = [ [package.dependencies] Pillow = [ {version = ">=9.3.0", markers = "python_version == \"3.11\""}, - {version = ">=9.2.0", markers = "python_version == \"3.10\" or python_version == \"3.9\""}, + {version = ">=9.2.0", markers = "python_version == \"3.10\""}, ] [[package]] @@ -8195,24 +7953,6 @@ files = [ {file = "python_crfsuite-0.9.10-cp39-cp39-win_amd64.whl", hash = "sha256:da8065383e41efe65d87de6fa83f1682a8ef65f26370300042ef88891971450c"}, ] -[[package]] -name = "python-crontab" -version = "3.2.0" -description = "Python Crontab API" -optional = false -python-versions = "*" -files = [ - {file = "python_crontab-3.2.0-py3-none-any.whl", hash = "sha256:82cb9b6a312d41ff66fd3caf3eed7115c28c195bfb50711bc2b4b9592feb9fe5"}, - {file = "python_crontab-3.2.0.tar.gz", hash = "sha256:40067d1dd39ade3460b2ad8557c7651514cd3851deffff61c5c60e1227c5c36b"}, -] - -[package.dependencies] -python-dateutil = "*" - -[package.extras] -cron-description = ["cron-descriptor"] -cron-schedule = ["croniter"] - [[package]] name = "python-dateutil" version = "2.9.0.post0" @@ -8279,31 +8019,6 @@ files = [ {file = "python3-xlib-0.15.tar.gz", hash = "sha256:dc4245f3ae4aa5949c1d112ee4723901ade37a96721ba9645f2bfa56e5b383f8"}, ] -[[package]] -name = "pythonnet" -version = "3.0.3" -description = ".NET and Mono integration for Python" -optional = false -python-versions = "<3.13,>=3.7" -files = [ - {file = "pythonnet-3.0.3-py3-none-any.whl", hash = "sha256:62486f860c7955b7dcf470e085e4d2b599512224ca24193f716e857b496c530f"}, - {file = "pythonnet-3.0.3.tar.gz", hash = "sha256:8d4b2e97158a023875f8647458a58f38817f4fe39af60abdd6b0d8adf1d77e75"}, -] - -[package.dependencies] -clr-loader = ">=0.2.6,<0.3.0" - -[[package]] -name = "pytimeparse" -version = "1.1.8" -description = "Time expression parser" -optional = false -python-versions = "*" -files = [ - {file = "pytimeparse-1.1.8-py2.py3-none-any.whl", hash = "sha256:04b7be6cc8bd9f5647a6325444926c3ac34ee6bc7e69da4367ba282f076036bd"}, - {file = "pytimeparse-1.1.8.tar.gz", hash = "sha256:e86136477be924d7e670646a98561957e8ca7308d44841e21f5ddea757556a0a"}, -] - [[package]] name = "pyttsx3" version = "2.90" @@ -8341,37 +8056,6 @@ files = [ {file = "pytz-2024.1.tar.gz", hash = "sha256:2a29735ea9c18baf14b448846bde5a48030ed267578472d8955cd0e7443a9812"}, ] -[[package]] -name = "pywebview" -version = "5.2" -description = "Build GUI for your Python program with JavaScript, HTML, and CSS" -optional = false -python-versions = ">=3.7" -files = [ - {file = "pywebview-5.2-py3-none-any.whl", hash = "sha256:07acceb74bfeed2b5bf19b9535f23ed68330ec6488ae63aea4d35290941cad7f"}, - {file = "pywebview-5.2.tar.gz", hash = "sha256:634c6e4547ef3f4de2a35cdaed59464b60fb61247f3f6017d6de4ddc1a2dadc2"}, -] - -[package.dependencies] -bottle = "*" -proxy-tools = "*" -pyobjc-core = {version = "*", markers = "sys_platform == \"darwin\""} -pyobjc-framework-Cocoa = {version = "*", markers = "sys_platform == \"darwin\""} -pyobjc-framework-Quartz = {version = "*", markers = "sys_platform 
== \"darwin\""} -pyobjc-framework-security = {version = "*", markers = "sys_platform == \"darwin\""} -pyobjc-framework-WebKit = {version = "*", markers = "sys_platform == \"darwin\""} -pythonnet = {version = "*", markers = "sys_platform == \"win32\""} -QtPy = {version = "*", markers = "sys_platform == \"openbsd6\""} -typing-extensions = "*" - -[package.extras] -android = ["jnius", "kivy"] -cef = ["cefpython3"] -gtk = ["PyGObject", "PyGObject-stubs"] -pyside2 = ["PySide2", "QtPy"] -pyside6 = ["PySide6", "QtPy"] -qt = ["PyQt5", "QtPy", "pyqtwebengine"] - [[package]] name = "pywin32" version = "306" @@ -8500,142 +8184,125 @@ files = [ [[package]] name = "pyzmq" -version = "26.1.1" +version = "26.2.0" description = "Python bindings for 0MQ" optional = false python-versions = ">=3.7" files = [ - {file = "pyzmq-26.1.1-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:b1bb952d1e407463c9333ea7e0c0600001e54e08ce836d4f0aff1fb3f902cf63"}, - {file = "pyzmq-26.1.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:65e2a18e845c6ea7ab849c70db932eaeadee5edede9e379eb21c0a44cf523b2e"}, - {file = "pyzmq-26.1.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:def7ae3006924b8a0c146a89ab4008310913fa903beedb95e25dea749642528e"}, - {file = "pyzmq-26.1.1-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a8234571df7816f99dde89c3403cb396d70c6554120b795853a8ea56fcc26cd3"}, - {file = "pyzmq-26.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18da8e84dbc30688fd2baefd41df7190607511f916be34f9a24b0e007551822e"}, - {file = "pyzmq-26.1.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:c70dab93d98b2bf3f0ac1265edbf6e7f83acbf71dabcc4611889bb0dea45bed7"}, - {file = "pyzmq-26.1.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:fcb90592c5d5c562e1b1a1ceccf6f00036d73c51db0271bf4d352b8d6b31d468"}, - {file = "pyzmq-26.1.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:cf4be7460a0c1bc71e9b0e64ecdd75a86386ca6afaa36641686f5542d0314e9d"}, - {file = "pyzmq-26.1.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:a4cbecda4ddbfc1e309c3be04d333f9be3fc6178b8b6592b309676f929767a15"}, - {file = "pyzmq-26.1.1-cp310-cp310-win32.whl", hash = "sha256:583f73b113b8165713b6ce028d221402b1b69483055b5aa3f991937e34dd1ead"}, - {file = "pyzmq-26.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:5e6f39ecb8eb7bfcb976c49262e8cf83ff76e082b77ca23ba90c9b6691a345be"}, - {file = "pyzmq-26.1.1-cp310-cp310-win_arm64.whl", hash = "sha256:8d042d6446cab3a1388b38596f5acabb9926b0b95c3894c519356b577a549458"}, - {file = "pyzmq-26.1.1-cp311-cp311-macosx_10_15_universal2.whl", hash = "sha256:362cac2423e36966d336d79d3ec3eafeabc153ee3e7a5cf580d7e74a34b3d912"}, - {file = "pyzmq-26.1.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0841633446cb1539a832a19bb24c03a20c00887d0cedd1d891b495b07e5c5cb5"}, - {file = "pyzmq-26.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e1fcdc333afbf9918d0a614a6e10858aede7da49a60f6705a77e343fe86a317"}, - {file = "pyzmq-26.1.1-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cc8d655627d775475eafdcf0e49e74bcc1e5e90afd9ab813b4da98f092ed7b93"}, - {file = "pyzmq-26.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:32de51744820857a6f7c3077e620ab3f607d0e4388dfead885d5124ab9bcdc5e"}, - {file = "pyzmq-26.1.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:a880240597010914ffb1d6edd04d3deb7ce6a2abf79a0012751438d13630a671"}, - {file = 
"pyzmq-26.1.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:26131b1cec02f941ed2d2b4b8cc051662b1c248b044eff5069df1f500bbced56"}, - {file = "pyzmq-26.1.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:ce05841322b58510607f9508a573138d995a46c7928887bc433de9cb760fd2ad"}, - {file = "pyzmq-26.1.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:32123ff0a6db521aadf2b95201e967a4e0d11fb89f73663a99d2f54881c07214"}, - {file = "pyzmq-26.1.1-cp311-cp311-win32.whl", hash = "sha256:e790602d7ea1d6c7d8713d571226d67de7ffe47b1e22ae2c043ebd537de1bccb"}, - {file = "pyzmq-26.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:717960855f2d6fdc2dba9df49dff31c414187bb11c76af36343a57d1f7083d9a"}, - {file = "pyzmq-26.1.1-cp311-cp311-win_arm64.whl", hash = "sha256:08956c26dbcd4fd8835cb777a16e21958ed2412317630e19f0018d49dbeeb470"}, - {file = "pyzmq-26.1.1-cp312-cp312-macosx_10_15_universal2.whl", hash = "sha256:e80345900ae241c2c51bead7c9fa247bba6d4b2a83423e9791bae8b0a7f12c52"}, - {file = "pyzmq-26.1.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ec8fe214fcc45dfb0c32e4a7ad1db20244ba2d2fecbf0cbf9d5242d81ca0a375"}, - {file = "pyzmq-26.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cf4e283f97688d993cb7a8acbc22889effbbb7cbaa19ee9709751f44be928f5d"}, - {file = "pyzmq-26.1.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2508bdc8ab246e5ed7c92023d4352aaad63020ca3b098a4e3f1822db202f703d"}, - {file = "pyzmq-26.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:741bdb4d96efe8192616abdc3671931d51a8bcd38c71da2d53fb3127149265d1"}, - {file = "pyzmq-26.1.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:76154943e4c4054b2591792eb3484ef1dd23d59805759f9cebd2f010aa30ee8c"}, - {file = "pyzmq-26.1.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9498ac427d20d0e0ef0e4bbd6200841e91640dfdf619f544ceec7f464cfb6070"}, - {file = "pyzmq-26.1.1-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:6f34453ef3496ca3462f30435bf85f535f9550392987341f9ccc92c102825a79"}, - {file = "pyzmq-26.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:50f0669324e27cc2091ef6ab76ca7112f364b6249691790b4cffce31e73fda28"}, - {file = "pyzmq-26.1.1-cp312-cp312-win32.whl", hash = "sha256:3ee5cbf2625b94de21c68d0cefd35327c8dfdbd6a98fcc41682b4e8bb00d841f"}, - {file = "pyzmq-26.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:75bd448a28b1001b6928679015bc95dd5f172703ed30135bb9e34fc9cda0a3e7"}, - {file = "pyzmq-26.1.1-cp312-cp312-win_arm64.whl", hash = "sha256:4350233569b4bbef88595c5e77ee38995a6f1f1790fae148b578941bfffd1c24"}, - {file = "pyzmq-26.1.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:6c8087a3281c20b1d11042d372ed5a47734af05975d78e4d1d6e7bd1018535f3"}, - {file = "pyzmq-26.1.1-cp313-cp313-macosx_10_15_universal2.whl", hash = "sha256:ebef7d3fe11fe4c688f08bc0211a976c3318c097057f258428200737b9fff4da"}, - {file = "pyzmq-26.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7a5342110510045a47de1e87f5f1dcc1d9d90109522316dc9830cfc6157c800f"}, - {file = "pyzmq-26.1.1-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:af690ea4be6ca92a67c2b44a779a023bf0838e92d48497a2268175dc4a505691"}, - {file = "pyzmq-26.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc994e220c1403ae087d7f0fa45129d583e46668a019e389060da811a5a9320e"}, - {file = "pyzmq-26.1.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:b8e153f5dffb0310af71fc6fc9cd8174f4c8ea312c415adcb815d786fee78179"}, - 
{file = "pyzmq-26.1.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:0065026e624052a51033857e5cd45a94b52946b44533f965f0bdf182460e965d"}, - {file = "pyzmq-26.1.1-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:63351392f948b5d50b9f55161994bc4feedbfb3f3cfe393d2f503dea2c3ec445"}, - {file = "pyzmq-26.1.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ffecc43b3c18e36b62fcec995761829b6ac325d8dd74a4f2c5c1653afbb4495a"}, - {file = "pyzmq-26.1.1-cp313-cp313-win32.whl", hash = "sha256:6ff14c2fae6c0c2c1c02590c5c5d75aa1db35b859971b3ca2fcd28f983d9f2b6"}, - {file = "pyzmq-26.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:85f2d2ee5ea9a8f1de86a300e1062fbab044f45b5ce34d20580c0198a8196db0"}, - {file = "pyzmq-26.1.1-cp313-cp313-win_arm64.whl", hash = "sha256:cc09b1de8b985ca5a0ca343dd7fb007267c6b329347a74e200f4654268084239"}, - {file = "pyzmq-26.1.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:bc904e86de98f8fc5bd41597da5d61232d2d6d60c4397f26efffabb961b2b245"}, - {file = "pyzmq-26.1.1-cp313-cp313t-macosx_10_15_universal2.whl", hash = "sha256:00f39c367bbd6aa8e4bc36af6510561944c619b58eb36199fa334b594a18f615"}, - {file = "pyzmq-26.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:de6f384864a959866b782e6a3896538d1424d183f2d3c7ef079f71dcecde7284"}, - {file = "pyzmq-26.1.1-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3abb15df0c763339edb27a644c19381b2425ddd1aea3dbd77c1601a3b31867b8"}, - {file = "pyzmq-26.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40908ec2dd3b29bbadc0916a0d3c87f8dbeebbd8fead8e618539f09e0506dec4"}, - {file = "pyzmq-26.1.1-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:c11a95d3f6fc7e714ccd1066f68f9c1abd764a8b3596158be92f46dd49f41e03"}, - {file = "pyzmq-26.1.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:4437af9fee7a58302dbd511cc49f0cc2b35c112a33a1111fb123cf0be45205ca"}, - {file = "pyzmq-26.1.1-cp313-cp313t-musllinux_1_1_i686.whl", hash = "sha256:76390d3d66406cb01b9681c382874400e9dfd77f30ecdea4bd1bf5226dd4aff0"}, - {file = "pyzmq-26.1.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:4d4c7fe5e50e269f9c63a260638488fec194a73993008618a59b54c47ef6ae72"}, - {file = "pyzmq-26.1.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:25d128524207f53f7aae7c5abdc2b63f8957a060b00521af5ffcd20986b5d8f4"}, - {file = "pyzmq-26.1.1-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:d74b925d997e4f92b042bdd7085cd0a309ee0fd7cb4dc376059bbff6b32ff34f"}, - {file = "pyzmq-26.1.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:732f957441e5b1c65a7509395e6b6cafee9e12df9aa5f4bf92ed266fe0ba70ee"}, - {file = "pyzmq-26.1.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f0a45102ad7ed9f9ddf2bd699cc5df37742cf7301111cba06001b927efecb120"}, - {file = "pyzmq-26.1.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:9f380d5333fc7cd17423f486125dcc073918676e33db70a6a8172b19fc78d23d"}, - {file = "pyzmq-26.1.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:8eaffcd6bf6a9d00b66a2052a33fa7e6a6575427e9644395f13c3d070f2918dc"}, - {file = "pyzmq-26.1.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:f1483d4975ae1b387b39bb8e23d1ff32fe5621aa9e4ed3055d05e9c5613fea53"}, - {file = "pyzmq-26.1.1-cp37-cp37m-win32.whl", hash = "sha256:a83653c6bbe5887caea55e49fbd2909c14b73acf43bcc051eb60b2d514bbd46e"}, - {file = "pyzmq-26.1.1-cp37-cp37m-win_amd64.whl", hash = 
"sha256:9763a8d3f5f74ef679989b373c37cc22e8d07e56d26439205cb83edb7722357f"}, - {file = "pyzmq-26.1.1-cp38-cp38-macosx_10_15_universal2.whl", hash = "sha256:2b045647caf620ce0ed6c8fd9fb6a73116f99aceed966b152a5ba1b416d25311"}, - {file = "pyzmq-26.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f66dcb6625c002f209cdc12cae1a1fec926493cd2262efe37dc6b25a30cea863"}, - {file = "pyzmq-26.1.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:0cf1d980c969fb9e538f52abd2227f09e015096bc5c3ef7aa26e0d64051c1db8"}, - {file = "pyzmq-26.1.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:443ebf5e261a95ee9725693f2a5a71401f89b89df0e0ea58844b074067aac2f1"}, - {file = "pyzmq-26.1.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29de77ba1b1877fe7defc1b9140e65cbd35f72a63bc501e56c2eae55bde5fff4"}, - {file = "pyzmq-26.1.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2f6071ec95af145d7b659dae6786871cd85f0acc599286b6f8ba0c74592d83dd"}, - {file = "pyzmq-26.1.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:6f0512fc87629ad968889176bf2165d721cd817401a281504329e2a2ed0ca6a3"}, - {file = "pyzmq-26.1.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5ccfcf13e80719f6a2d9c0a021d9e47d4550907a29253554be2c09582f6d7963"}, - {file = "pyzmq-26.1.1-cp38-cp38-win32.whl", hash = "sha256:809673947e95752e407aaaaf03f205ee86ebfff9ca51db6d4003dfd87b8428d1"}, - {file = "pyzmq-26.1.1-cp38-cp38-win_amd64.whl", hash = "sha256:62b5180e23e6f581600459cd983473cd723fdc64350f606d21407c99832aaf5f"}, - {file = "pyzmq-26.1.1-cp39-cp39-macosx_10_15_universal2.whl", hash = "sha256:fe73d7c89d6f803bed122135ff5783364e8cdb479cf6fe2d764a44b6349e7e0f"}, - {file = "pyzmq-26.1.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:db1b7e2b50ef21f398036786da4c153db63203a402396d9f21e08ea61f3f8dba"}, - {file = "pyzmq-26.1.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:7c506a51cb01bb997a3f6440db0d121e5e7a32396e9948b1fdb6a7bfa67243f4"}, - {file = "pyzmq-26.1.1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:92eca4f80e8a748d880e55d3cf57ef487692e439f12d5c5a2e1cce84aaa7f6cb"}, - {file = "pyzmq-26.1.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:14bdbae02f72f4716b0ffe7500e9da303d719ddde1f3dcfb4c4f6cc1cf73bb02"}, - {file = "pyzmq-26.1.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e03be7ed17836c9434cce0668ac1e2cc9143d7169f90f46a0167f6155e176e32"}, - {file = "pyzmq-26.1.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:bc5df31e36e4fddd4c8b5c42daee8d54d7b529e898ac984be97bf5517de166a7"}, - {file = "pyzmq-26.1.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:f218179c90a12d660906e04b25a340dd63e9743000ba16232ddaf46888f269da"}, - {file = "pyzmq-26.1.1-cp39-cp39-win32.whl", hash = "sha256:7dfabc180a4da422a4b349c63077347392463a75fa07aa3be96712ed6d42c547"}, - {file = "pyzmq-26.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:c5248e6e0fcbbbc912982e99cdd51c342601f495b0fa5bd667f3bdbdbf3e170f"}, - {file = "pyzmq-26.1.1-cp39-cp39-win_arm64.whl", hash = "sha256:2ae7aa1408778dc74582a1226052b930f9083b54b64d7e6ef6ec0466cfdcdec2"}, - {file = "pyzmq-26.1.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:be3fc2b11c0c384949cf1f01f9a48555039408b0f3e877863b1754225635953e"}, - {file = "pyzmq-26.1.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:48dee75c2a9fa4f4a583d4028d564a0453447ee1277a29b07acc3743c092e259"}, - {file = 
"pyzmq-26.1.1-pp310-pypy310_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:23f2fe4fb567e8098ebaa7204819658195b10ddd86958a97a6058eed2901eed3"}, - {file = "pyzmq-26.1.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:472cacd16f627c06d3c8b2d374345ab74446bae913584a6245e2aa935336d929"}, - {file = "pyzmq-26.1.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:8285b25aa20fcc46f1ca4afbc39fd3d5f2fe4c4bbf7f2c7f907a214e87a70024"}, - {file = "pyzmq-26.1.1-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2067e63fd9d5c13cfe12624dab0366053e523b37a7a01678ce4321f839398939"}, - {file = "pyzmq-26.1.1-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:cc109be2ee3638035d276e18eaf66a1e1f44201c0c4bea4ee0c692766bbd3570"}, - {file = "pyzmq-26.1.1-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:d0da97e65ee73261dba70469cc8f63d8da3a8a825337a2e3d246b9e95141cdd0"}, - {file = "pyzmq-26.1.1-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa79c528706561306938b275f89bb2c6985ce08469c27e5de05bc680df5e826f"}, - {file = "pyzmq-26.1.1-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:3ddbd851a3a2651fdc5065a2804d50cf2f4b13b1bcd66de8e9e855d0217d4fcd"}, - {file = "pyzmq-26.1.1-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:d3df226ab7464684ae6706e20a5cbab717c3735a7e409b3fa598b754d49f1946"}, - {file = "pyzmq-26.1.1-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:abad7b897e960d577eb4a0f3f789c1780bc3ffe2e7c27cf317e7c90ad26acf12"}, - {file = "pyzmq-26.1.1-pp38-pypy38_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:c513d829a548c2d5c88983167be2b3aa537f6d1191edcdc6fcd8999e18bdd994"}, - {file = "pyzmq-26.1.1-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:70af4c9c991714ef1c65957605a8de42ef0d0620dd5f125953c8e682281bdb80"}, - {file = "pyzmq-26.1.1-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:8d4234f335b0d0842f7d661d8cd50cbad0729be58f1c4deb85cd96b38fe95025"}, - {file = "pyzmq-26.1.1-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:2c0fdb7b758e0e1605157e480b00b3a599073068a37091a1c75ec65bf7498645"}, - {file = "pyzmq-26.1.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fc657577f057d60dd3642c9f95f28b432889b73143140061f7c1331d02f03df6"}, - {file = "pyzmq-26.1.1-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3e3b66fe6131b4f33d239f7d4c3bfb2f8532d8644bae3b3da4f3987073edac55"}, - {file = "pyzmq-26.1.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59b57e912feef6951aec8bb03fe0faa5ad5f36962883c72a30a9c965e6d988fd"}, - {file = "pyzmq-26.1.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:146956aec7d947c5afc5e7da0841423d7a53f84fd160fff25e682361dcfb32cb"}, - {file = "pyzmq-26.1.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:9521b874fd489495865172f344e46e0159095d1f161858e3fc6e28e43ca15160"}, - {file = "pyzmq-26.1.1.tar.gz", hash = "sha256:a7db05d8b7cd1a8c6610e9e9aa55d525baae7a44a43e18bc3260eb3f92de96c6"}, + {file = "pyzmq-26.2.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:ddf33d97d2f52d89f6e6e7ae66ee35a4d9ca6f36eda89c24591b0c40205a3629"}, + {file = "pyzmq-26.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:dacd995031a01d16eec825bf30802fceb2c3791ef24bcce48fa98ce40918c27b"}, + {file = "pyzmq-26.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:89289a5ee32ef6c439086184529ae060c741334b8970a6855ec0b6ad3ff28764"}, + {file = "pyzmq-26.2.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5506f06d7dc6ecf1efacb4a013b1f05071bb24b76350832c96449f4a2d95091c"}, + {file = "pyzmq-26.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8ea039387c10202ce304af74def5021e9adc6297067f3441d348d2b633e8166a"}, + {file = "pyzmq-26.2.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a2224fa4a4c2ee872886ed00a571f5e967c85e078e8e8c2530a2fb01b3309b88"}, + {file = "pyzmq-26.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:28ad5233e9c3b52d76196c696e362508959741e1a005fb8fa03b51aea156088f"}, + {file = "pyzmq-26.2.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:1c17211bc037c7d88e85ed8b7d8f7e52db6dc8eca5590d162717c654550f7282"}, + {file = "pyzmq-26.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:b8f86dd868d41bea9a5f873ee13bf5551c94cf6bc51baebc6f85075971fe6eea"}, + {file = "pyzmq-26.2.0-cp310-cp310-win32.whl", hash = "sha256:46a446c212e58456b23af260f3d9fb785054f3e3653dbf7279d8f2b5546b21c2"}, + {file = "pyzmq-26.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:49d34ab71db5a9c292a7644ce74190b1dd5a3475612eefb1f8be1d6961441971"}, + {file = "pyzmq-26.2.0-cp310-cp310-win_arm64.whl", hash = "sha256:bfa832bfa540e5b5c27dcf5de5d82ebc431b82c453a43d141afb1e5d2de025fa"}, + {file = "pyzmq-26.2.0-cp311-cp311-macosx_10_15_universal2.whl", hash = "sha256:8f7e66c7113c684c2b3f1c83cdd3376103ee0ce4c49ff80a648643e57fb22218"}, + {file = "pyzmq-26.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3a495b30fc91db2db25120df5847d9833af237546fd59170701acd816ccc01c4"}, + {file = "pyzmq-26.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77eb0968da535cba0470a5165468b2cac7772cfb569977cff92e240f57e31bef"}, + {file = "pyzmq-26.2.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ace4f71f1900a548f48407fc9be59c6ba9d9aaf658c2eea6cf2779e72f9f317"}, + {file = "pyzmq-26.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:92a78853d7280bffb93df0a4a6a2498cba10ee793cc8076ef797ef2f74d107cf"}, + {file = "pyzmq-26.2.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:689c5d781014956a4a6de61d74ba97b23547e431e9e7d64f27d4922ba96e9d6e"}, + {file = "pyzmq-26.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0aca98bc423eb7d153214b2df397c6421ba6373d3397b26c057af3c904452e37"}, + {file = "pyzmq-26.2.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:1f3496d76b89d9429a656293744ceca4d2ac2a10ae59b84c1da9b5165f429ad3"}, + {file = "pyzmq-26.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5c2b3bfd4b9689919db068ac6c9911f3fcb231c39f7dd30e3138be94896d18e6"}, + {file = "pyzmq-26.2.0-cp311-cp311-win32.whl", hash = "sha256:eac5174677da084abf378739dbf4ad245661635f1600edd1221f150b165343f4"}, + {file = "pyzmq-26.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:5a509df7d0a83a4b178d0f937ef14286659225ef4e8812e05580776c70e155d5"}, + {file = "pyzmq-26.2.0-cp311-cp311-win_arm64.whl", hash = "sha256:c0e6091b157d48cbe37bd67233318dbb53e1e6327d6fc3bb284afd585d141003"}, + {file = "pyzmq-26.2.0-cp312-cp312-macosx_10_15_universal2.whl", hash = "sha256:ded0fc7d90fe93ae0b18059930086c51e640cdd3baebdc783a695c77f123dcd9"}, + {file = "pyzmq-26.2.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:17bf5a931c7f6618023cdacc7081f3f266aecb68ca692adac015c383a134ca52"}, + {file = "pyzmq-26.2.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", 
hash = "sha256:55cf66647e49d4621a7e20c8d13511ef1fe1efbbccf670811864452487007e08"}, + {file = "pyzmq-26.2.0-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4661c88db4a9e0f958c8abc2b97472e23061f0bc737f6f6179d7a27024e1faa5"}, + {file = "pyzmq-26.2.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ea7f69de383cb47522c9c208aec6dd17697db7875a4674c4af3f8cfdac0bdeae"}, + {file = "pyzmq-26.2.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:7f98f6dfa8b8ccaf39163ce872bddacca38f6a67289116c8937a02e30bbe9711"}, + {file = "pyzmq-26.2.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:e3e0210287329272539eea617830a6a28161fbbd8a3271bf4150ae3e58c5d0e6"}, + {file = "pyzmq-26.2.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:6b274e0762c33c7471f1a7471d1a2085b1a35eba5cdc48d2ae319f28b6fc4de3"}, + {file = "pyzmq-26.2.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:29c6a4635eef69d68a00321e12a7d2559fe2dfccfa8efae3ffb8e91cd0b36a8b"}, + {file = "pyzmq-26.2.0-cp312-cp312-win32.whl", hash = "sha256:989d842dc06dc59feea09e58c74ca3e1678c812a4a8a2a419046d711031f69c7"}, + {file = "pyzmq-26.2.0-cp312-cp312-win_amd64.whl", hash = "sha256:2a50625acdc7801bc6f74698c5c583a491c61d73c6b7ea4dee3901bb99adb27a"}, + {file = "pyzmq-26.2.0-cp312-cp312-win_arm64.whl", hash = "sha256:4d29ab8592b6ad12ebbf92ac2ed2bedcfd1cec192d8e559e2e099f648570e19b"}, + {file = "pyzmq-26.2.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9dd8cd1aeb00775f527ec60022004d030ddc51d783d056e3e23e74e623e33726"}, + {file = "pyzmq-26.2.0-cp313-cp313-macosx_10_15_universal2.whl", hash = "sha256:28c812d9757fe8acecc910c9ac9dafd2ce968c00f9e619db09e9f8f54c3a68a3"}, + {file = "pyzmq-26.2.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4d80b1dd99c1942f74ed608ddb38b181b87476c6a966a88a950c7dee118fdf50"}, + {file = "pyzmq-26.2.0-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8c997098cc65e3208eca09303630e84d42718620e83b733d0fd69543a9cab9cb"}, + {file = "pyzmq-26.2.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7ad1bc8d1b7a18497dda9600b12dc193c577beb391beae5cd2349184db40f187"}, + {file = "pyzmq-26.2.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:bea2acdd8ea4275e1278350ced63da0b166421928276c7c8e3f9729d7402a57b"}, + {file = "pyzmq-26.2.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:23f4aad749d13698f3f7b64aad34f5fc02d6f20f05999eebc96b89b01262fb18"}, + {file = "pyzmq-26.2.0-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:a4f96f0d88accc3dbe4a9025f785ba830f968e21e3e2c6321ccdfc9aef755115"}, + {file = "pyzmq-26.2.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ced65e5a985398827cc9276b93ef6dfabe0273c23de8c7931339d7e141c2818e"}, + {file = "pyzmq-26.2.0-cp313-cp313-win32.whl", hash = "sha256:31507f7b47cc1ead1f6e86927f8ebb196a0bab043f6345ce070f412a59bf87b5"}, + {file = "pyzmq-26.2.0-cp313-cp313-win_amd64.whl", hash = "sha256:70fc7fcf0410d16ebdda9b26cbd8bf8d803d220a7f3522e060a69a9c87bf7bad"}, + {file = "pyzmq-26.2.0-cp313-cp313-win_arm64.whl", hash = "sha256:c3789bd5768ab5618ebf09cef6ec2b35fed88709b104351748a63045f0ff9797"}, + {file = "pyzmq-26.2.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:034da5fc55d9f8da09015d368f519478a52675e558c989bfcb5cf6d4e16a7d2a"}, + {file = "pyzmq-26.2.0-cp313-cp313t-macosx_10_15_universal2.whl", hash = "sha256:c92d73464b886931308ccc45b2744e5968cbaade0b1d6aeb40d8ab537765f5bc"}, + {file = 
"pyzmq-26.2.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:794a4562dcb374f7dbbfb3f51d28fb40123b5a2abadee7b4091f93054909add5"}, + {file = "pyzmq-26.2.0-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aee22939bb6075e7afededabad1a56a905da0b3c4e3e0c45e75810ebe3a52672"}, + {file = "pyzmq-26.2.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ae90ff9dad33a1cfe947d2c40cb9cb5e600d759ac4f0fd22616ce6540f72797"}, + {file = "pyzmq-26.2.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:43a47408ac52647dfabbc66a25b05b6a61700b5165807e3fbd40063fcaf46386"}, + {file = "pyzmq-26.2.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:25bf2374a2a8433633c65ccb9553350d5e17e60c8eb4de4d92cc6bd60f01d306"}, + {file = "pyzmq-26.2.0-cp313-cp313t-musllinux_1_1_i686.whl", hash = "sha256:007137c9ac9ad5ea21e6ad97d3489af654381324d5d3ba614c323f60dab8fae6"}, + {file = "pyzmq-26.2.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:470d4a4f6d48fb34e92d768b4e8a5cc3780db0d69107abf1cd7ff734b9766eb0"}, + {file = "pyzmq-26.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:3b55a4229ce5da9497dd0452b914556ae58e96a4381bb6f59f1305dfd7e53fc8"}, + {file = "pyzmq-26.2.0-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:9cb3a6460cdea8fe8194a76de8895707e61ded10ad0be97188cc8463ffa7e3a8"}, + {file = "pyzmq-26.2.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8ab5cad923cc95c87bffee098a27856c859bd5d0af31bd346035aa816b081fe1"}, + {file = "pyzmq-26.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9ed69074a610fad1c2fda66180e7b2edd4d31c53f2d1872bc2d1211563904cd9"}, + {file = "pyzmq-26.2.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:cccba051221b916a4f5e538997c45d7d136a5646442b1231b916d0164067ea27"}, + {file = "pyzmq-26.2.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:0eaa83fc4c1e271c24eaf8fb083cbccef8fde77ec8cd45f3c35a9a123e6da097"}, + {file = "pyzmq-26.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:9edda2df81daa129b25a39b86cb57dfdfe16f7ec15b42b19bfac503360d27a93"}, + {file = "pyzmq-26.2.0-cp37-cp37m-win32.whl", hash = "sha256:ea0eb6af8a17fa272f7b98d7bebfab7836a0d62738e16ba380f440fceca2d951"}, + {file = "pyzmq-26.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:4ff9dc6bc1664bb9eec25cd17506ef6672d506115095411e237d571e92a58231"}, + {file = "pyzmq-26.2.0-cp38-cp38-macosx_10_15_universal2.whl", hash = "sha256:2eb7735ee73ca1b0d71e0e67c3739c689067f055c764f73aac4cc8ecf958ee3f"}, + {file = "pyzmq-26.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a534f43bc738181aa7cbbaf48e3eca62c76453a40a746ab95d4b27b1111a7d2"}, + {file = "pyzmq-26.2.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:aedd5dd8692635813368e558a05266b995d3d020b23e49581ddd5bbe197a8ab6"}, + {file = "pyzmq-26.2.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8be4700cd8bb02cc454f630dcdf7cfa99de96788b80c51b60fe2fe1dac480289"}, + {file = "pyzmq-26.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fcc03fa4997c447dce58264e93b5aa2d57714fbe0f06c07b7785ae131512732"}, + {file = "pyzmq-26.2.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:402b190912935d3db15b03e8f7485812db350d271b284ded2b80d2e5704be780"}, + {file = "pyzmq-26.2.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8685fa9c25ff00f550c1fec650430c4b71e4e48e8d852f7ddcf2e48308038640"}, + {file = 
"pyzmq-26.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:76589c020680778f06b7e0b193f4b6dd66d470234a16e1df90329f5e14a171cd"}, + {file = "pyzmq-26.2.0-cp38-cp38-win32.whl", hash = "sha256:8423c1877d72c041f2c263b1ec6e34360448decfb323fa8b94e85883043ef988"}, + {file = "pyzmq-26.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:76589f2cd6b77b5bdea4fca5992dc1c23389d68b18ccc26a53680ba2dc80ff2f"}, + {file = "pyzmq-26.2.0-cp39-cp39-macosx_10_15_universal2.whl", hash = "sha256:b1d464cb8d72bfc1a3adc53305a63a8e0cac6bc8c5a07e8ca190ab8d3faa43c2"}, + {file = "pyzmq-26.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:4da04c48873a6abdd71811c5e163bd656ee1b957971db7f35140a2d573f6949c"}, + {file = "pyzmq-26.2.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:d049df610ac811dcffdc147153b414147428567fbbc8be43bb8885f04db39d98"}, + {file = "pyzmq-26.2.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:05590cdbc6b902101d0e65d6a4780af14dc22914cc6ab995d99b85af45362cc9"}, + {file = "pyzmq-26.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c811cfcd6a9bf680236c40c6f617187515269ab2912f3d7e8c0174898e2519db"}, + {file = "pyzmq-26.2.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:6835dd60355593de10350394242b5757fbbd88b25287314316f266e24c61d073"}, + {file = "pyzmq-26.2.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:bc6bee759a6bddea5db78d7dcd609397449cb2d2d6587f48f3ca613b19410cfc"}, + {file = "pyzmq-26.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c530e1eecd036ecc83c3407f77bb86feb79916d4a33d11394b8234f3bd35b940"}, + {file = "pyzmq-26.2.0-cp39-cp39-win32.whl", hash = "sha256:367b4f689786fca726ef7a6c5ba606958b145b9340a5e4808132cc65759abd44"}, + {file = "pyzmq-26.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:e6fa2e3e683f34aea77de8112f6483803c96a44fd726d7358b9888ae5bb394ec"}, + {file = "pyzmq-26.2.0-cp39-cp39-win_arm64.whl", hash = "sha256:7445be39143a8aa4faec43b076e06944b8f9d0701b669df4af200531b21e40bb"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:706e794564bec25819d21a41c31d4df2d48e1cc4b061e8d345d7fb4dd3e94072"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b435f2753621cd36e7c1762156815e21c985c72b19135dac43a7f4f31d28dd1"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:160c7e0a5eb178011e72892f99f918c04a131f36056d10d9c1afb223fc952c2d"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c4a71d5d6e7b28a47a394c0471b7e77a0661e2d651e7ae91e0cab0a587859ca"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:90412f2db8c02a3864cbfc67db0e3dcdbda336acf1c469526d3e869394fe001c"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2ea4ad4e6a12e454de05f2949d4beddb52460f3de7c8b9d5c46fbb7d7222e02c"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:fc4f7a173a5609631bb0c42c23d12c49df3966f89f496a51d3eb0ec81f4519d6"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:878206a45202247781472a2d99df12a176fef806ca175799e1c6ad263510d57c"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:17c412bad2eb9468e876f556eb4ee910e62d721d2c7a53c7fa31e643d35352e6"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-win_amd64.whl", hash = 
"sha256:0d987a3ae5a71c6226b203cfd298720e0086c7fe7c74f35fa8edddfbd6597eed"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:39887ac397ff35b7b775db7201095fc6310a35fdbae85bac4523f7eb3b840e20"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:fdb5b3e311d4d4b0eb8b3e8b4d1b0a512713ad7e6a68791d0923d1aec433d919"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:226af7dcb51fdb0109f0016449b357e182ea0ceb6b47dfb5999d569e5db161d5"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0bed0e799e6120b9c32756203fb9dfe8ca2fb8467fed830c34c877e25638c3fc"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:29c7947c594e105cb9e6c466bace8532dc1ca02d498684128b339799f5248277"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:cdeabcff45d1c219636ee2e54d852262e5c2e085d6cb476d938aee8d921356b3"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:35cffef589bcdc587d06f9149f8d5e9e8859920a071df5a2671de2213bef592a"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:18c8dc3b7468d8b4bdf60ce9d7141897da103c7a4690157b32b60acb45e333e6"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7133d0a1677aec369d67dd78520d3fa96dd7f3dcec99d66c1762870e5ea1a50a"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:6a96179a24b14fa6428cbfc08641c779a53f8fcec43644030328f44034c7f1f4"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:4f78c88905461a9203eac9faac157a2a0dbba84a0fd09fd29315db27be40af9f"}, + {file = "pyzmq-26.2.0.tar.gz", hash = "sha256:070672c258581c8e4f640b5159297580a9974b026043bd4ab0470be9ed324f1f"}, ] [package.dependencies] cffi = {version = "*", markers = "implementation_name == \"pypy\""} -[[package]] -name = "qtpy" -version = "2.4.1" -description = "Provides an abstraction layer on top of the various Qt bindings (PyQt5/6 and PySide2/6)." 
-optional = false -python-versions = ">=3.7" -files = [ - {file = "QtPy-2.4.1-py3-none-any.whl", hash = "sha256:1c1d8c4fa2c884ae742b069151b0abe15b3f70491f3972698c683b8e38de839b"}, - {file = "QtPy-2.4.1.tar.gz", hash = "sha256:a5a15ffd519550a1361bdc56ffc07fda56a6af7292f17c7b395d4083af632987"}, -] - -[package.dependencies] -packaging = "*" - -[package.extras] -test = ["pytest (>=6,!=7.0.0,!=7.0.1)", "pytest-cov (>=3.0.0)", "pytest-qt"] - [[package]] name = "readchar" version = "4.2.0" @@ -8649,23 +8316,24 @@ files = [ [[package]] name = "realtimestt" -version = "0.1.16" +version = "0.2.41" description = "A fast Voice Activity Detection and Transcription System" optional = false python-versions = ">=3.6" files = [ - {file = "RealTimeSTT-0.1.16-py3-none-any.whl", hash = "sha256:8c0cf6b0db0b38b7cb6ba13fb68e8bade54b4636382758e80a4c7c3691f2cf89"}, - {file = "RealTimeSTT-0.1.16.tar.gz", hash = "sha256:efd6e3292ed28916456292cfeb9cf4956e896c3762b2abb89057e304db7a00e6"}, + {file = "RealtimeSTT-0.2.41-py3-none-any.whl", hash = "sha256:aa661a2e2f5f8cdf93086b5dcb34e79091911088e62fa70ca9f12b2e8d188e47"}, + {file = "RealtimeSTT-0.2.41.tar.gz", hash = "sha256:67cb411c13c6846fe544fd67f820b799cb294cfe1bb5b9be0e69cc107cf9fe5c"}, ] [package.dependencies] -faster-whisper = "1.0.2" +faster-whisper = "1.0.3" halo = "0.0.31" +openwakeword = "0.6.0" pvporcupine = "1.9.5" PyAudio = "0.2.14" scipy = "1.12.0" -torch = "2.3.0" -torchaudio = "2.3.0" +torch = "2.3.1" +torchaudio = "2.3.1" webrtcvad = "2.0.10" websockets = "v12.0" @@ -9243,9 +8911,6 @@ files = [ {file = "segno-1.6.1.tar.gz", hash = "sha256:f23da78b059251c36e210d0cf5bfb1a9ec1604ae6e9f3d42f9a7c16d306d847e"}, ] -[package.dependencies] -importlib-metadata = {version = ">=3.6.0", markers = "python_version < \"3.10\""} - [[package]] name = "send2trash" version = "1.8.3" @@ -9286,57 +8951,6 @@ transformers = ">=4.34.0,<5.0.0" [package.extras] dev = ["pre-commit", "pytest", "ruff (>=0.3.0)"] -[[package]] -name = "sentry-sdk" -version = "2.13.0" -description = "Python client for Sentry (https://sentry.io)" -optional = false -python-versions = ">=3.6" -files = [ - {file = "sentry_sdk-2.13.0-py2.py3-none-any.whl", hash = "sha256:6beede8fc2ab4043da7f69d95534e320944690680dd9a963178a49de71d726c6"}, - {file = "sentry_sdk-2.13.0.tar.gz", hash = "sha256:8d4a576f7a98eb2fdb40e13106e41f330e5c79d72a68be1316e7852cf4995260"}, -] - -[package.dependencies] -certifi = "*" -urllib3 = ">=1.26.11" - -[package.extras] -aiohttp = ["aiohttp (>=3.5)"] -anthropic = ["anthropic (>=0.16)"] -arq = ["arq (>=0.23)"] -asyncpg = ["asyncpg (>=0.23)"] -beam = ["apache-beam (>=2.12)"] -bottle = ["bottle (>=0.12.13)"] -celery = ["celery (>=3)"] -celery-redbeat = ["celery-redbeat (>=2)"] -chalice = ["chalice (>=1.16.0)"] -clickhouse-driver = ["clickhouse-driver (>=0.2.0)"] -django = ["django (>=1.8)"] -falcon = ["falcon (>=1.4)"] -fastapi = ["fastapi (>=0.79.0)"] -flask = ["blinker (>=1.1)", "flask (>=0.11)", "markupsafe"] -grpcio = ["grpcio (>=1.21.1)", "protobuf (>=3.8.0)"] -httpx = ["httpx (>=0.16.0)"] -huey = ["huey (>=2)"] -huggingface-hub = ["huggingface-hub (>=0.22)"] -langchain = ["langchain (>=0.0.210)"] -litestar = ["litestar (>=2.0.0)"] -loguru = ["loguru (>=0.5)"] -openai = ["openai (>=1.0.0)", "tiktoken (>=0.3.0)"] -opentelemetry = ["opentelemetry-distro (>=0.35b0)"] -opentelemetry-experimental = ["opentelemetry-distro"] -pure-eval = ["asttokens", "executing", "pure-eval"] -pymongo = ["pymongo (>=3.1)"] -pyspark = ["pyspark (>=2.4.4)"] -quart = ["blinker (>=1.1)", "quart (>=0.16.1)"] -rq 
= ["rq (>=0.6)"] -sanic = ["sanic (>=0.8)"] -sqlalchemy = ["sqlalchemy (>=1.2)"] -starlette = ["starlette (>=0.19.1)"] -starlite = ["starlite (>=1.48)"] -tornado = ["tornado (>=6)"] - [[package]] name = "setuptools" version = "73.0.1" @@ -9375,22 +8989,6 @@ files = [ {file = "shortuuid-1.0.13.tar.gz", hash = "sha256:3bb9cf07f606260584b1df46399c0b87dd84773e7b25912b7e391e30797c5e72"}, ] -[[package]] -name = "simpleaudio" -version = "1.0.4" -description = "Simple, asynchronous audio playback for Python 3." -optional = false -python-versions = "*" -files = [ - {file = "simpleaudio-1.0.4-cp37-cp37m-macosx_10_6_intel.whl", hash = "sha256:05b63da515f5fc7c6f40e4d9673d22239c5e03e2bda200fc09fd21c185d73713"}, - {file = "simpleaudio-1.0.4-cp37-cp37m-win32.whl", hash = "sha256:f1a4fe3358429b2ea3181fd782e4c4fff5c123ca86ec7fc29e01ee9acd8a227a"}, - {file = "simpleaudio-1.0.4-cp37-cp37m-win_amd64.whl", hash = "sha256:86f1b0985629852afe67259ac6c24905ca731cb202a6e96b818865c56ced0c27"}, - {file = "simpleaudio-1.0.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f68820297ad51577e3a77369e7e9b23989d30d5ae923bf34c92cf983c04ade04"}, - {file = "simpleaudio-1.0.4-cp38-cp38-win32.whl", hash = "sha256:67348e3d3ccbae73bd126beed7f1e242976889620dbc6974c36800cd286430fc"}, - {file = "simpleaudio-1.0.4-cp38-cp38-win_amd64.whl", hash = "sha256:f346a4eac9cdbb1b3f3d0995095b7e86c12219964c022f4d920c22f6ca05fb4c"}, - {file = "simpleaudio-1.0.4.tar.gz", hash = "sha256:691c88649243544db717e7edf6a9831df112104e1aefb5f6038a5d071e8cf41d"}, -] - [[package]] name = "six" version = "1.16.0" @@ -9720,7 +9318,6 @@ files = [ [package.dependencies] anyio = ">=3.4.0,<5" -typing-extensions = {version = ">=3.10.0", markers = "python_version < \"3.10\""} [package.extras] full = ["httpx (>=0.22.0)", "itsdangerous", "jinja2", "python-multipart (>=0.0.7)", "pyyaml"] @@ -9865,23 +9462,28 @@ files = [ tests = ["pytest", "pytest-cov"] [[package]] -name = "textual" -version = "0.50.1" -description = "Modern Text User Interface framework" +name = "tflite-runtime" +version = "2.14.0" +description = "TensorFlow Lite is for mobile and embedded devices." 
optional = false -python-versions = ">=3.8,<4.0" +python-versions = "*" files = [ - {file = "textual-0.50.1-py3-none-any.whl", hash = "sha256:11bd87fe6c543358122c43db2e9dfc5940900ef9b8975502ab7043792928638b"}, - {file = "textual-0.50.1.tar.gz", hash = "sha256:415bef44b2dfa702d17ebb08637c0141eb54767cfbeafe60d07e62104183b56a"}, + {file = "tflite_runtime-2.14.0-cp310-cp310-manylinux2014_x86_64.whl", hash = "sha256:bb11df4283e281cd609c621ac9470ad0cb5674408593272d7593a2c6bde8a808"}, + {file = "tflite_runtime-2.14.0-cp310-cp310-manylinux_2_34_aarch64.whl", hash = "sha256:d38c6885f5e9673c11a61ccec5cad7c032ab97340718d26b17794137f398b780"}, + {file = "tflite_runtime-2.14.0-cp310-cp310-manylinux_2_34_armv7l.whl", hash = "sha256:7fe33f763263d1ff2733a09945a7547ab063d8bc311fd2a1be8144d850016ad3"}, + {file = "tflite_runtime-2.14.0-cp311-cp311-manylinux2014_x86_64.whl", hash = "sha256:195ab752e7e57329a68e54dd3dd5439fad888b9bff1be0f0dc042a3237a90e4d"}, + {file = "tflite_runtime-2.14.0-cp311-cp311-manylinux_2_34_aarch64.whl", hash = "sha256:ce9fa5d770a9725c746dcbf6f59f3178233b3759f09982e8b2db8d2234c333b0"}, + {file = "tflite_runtime-2.14.0-cp311-cp311-manylinux_2_34_armv7l.whl", hash = "sha256:c4e66a74165b18089c86788400af19fa551768ac782d231a9beae2f6434f7949"}, + {file = "tflite_runtime-2.14.0-cp38-cp38-manylinux2014_x86_64.whl", hash = "sha256:9f965054467f7890e678943858c6ac76a5197b17f61b48dcbaaba0af41d541a7"}, + {file = "tflite_runtime-2.14.0-cp38-cp38-manylinux_2_34_aarch64.whl", hash = "sha256:437167fe3d8b12f50f5d694da8f45d268ab84a495e24c3dd810e02e1012125de"}, + {file = "tflite_runtime-2.14.0-cp38-cp38-manylinux_2_34_armv7l.whl", hash = "sha256:79d8e17f68cc940df7e68a177b22dda60fcffba195fb9dd908d03724d65fd118"}, + {file = "tflite_runtime-2.14.0-cp39-cp39-manylinux2014_x86_64.whl", hash = "sha256:4aa740210a0fd9e4db4a46e9778914846b136e161525681b41575ca4896158fb"}, + {file = "tflite_runtime-2.14.0-cp39-cp39-manylinux_2_34_aarch64.whl", hash = "sha256:be198b7dc4401204be54a15884d9e336389790eb707439524540f5a9329fdd02"}, + {file = "tflite_runtime-2.14.0-cp39-cp39-manylinux_2_34_armv7l.whl", hash = "sha256:eca7672adca32727bbf5c0f1caf398fc17bbe222f2a684c7a2caea6fc6767203"}, ] [package.dependencies] -markdown-it-py = {version = ">=2.1.0", extras = ["linkify", "plugins"]} -rich = ">=13.3.3" -typing-extensions = ">=4.4.0,<5.0.0" - -[package.extras] -syntax = ["tree-sitter (>=0.20.1,<0.21.0)", "tree_sitter_languages (>=1.7.0)"] +numpy = ">=1.23.2" [[package]] name = "thinc" @@ -10188,31 +9790,31 @@ files = [ [[package]] name = "torch" -version = "2.3.0" +version = "2.3.1" description = "Tensors and Dynamic neural networks in Python with strong GPU acceleration" optional = false python-versions = ">=3.8.0" files = [ - {file = "torch-2.3.0-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:d8ea5a465dbfd8501f33c937d1f693176c9aef9d1c1b0ca1d44ed7b0a18c52ac"}, - {file = "torch-2.3.0-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:09c81c5859a5b819956c6925a405ef1cdda393c9d8a01ce3851453f699d3358c"}, - {file = "torch-2.3.0-cp310-cp310-win_amd64.whl", hash = "sha256:1bf023aa20902586f614f7682fedfa463e773e26c58820b74158a72470259459"}, - {file = "torch-2.3.0-cp310-none-macosx_11_0_arm64.whl", hash = "sha256:758ef938de87a2653bba74b91f703458c15569f1562bf4b6c63c62d9c5a0c1f5"}, - {file = "torch-2.3.0-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:493d54ee2f9df100b5ce1d18c96dbb8d14908721f76351e908c9d2622773a788"}, - {file = "torch-2.3.0-cp311-cp311-manylinux2014_aarch64.whl", hash = 
"sha256:bce43af735c3da16cc14c7de2be7ad038e2fbf75654c2e274e575c6c05772ace"}, - {file = "torch-2.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:729804e97b7cf19ae9ab4181f91f5e612af07956f35c8b2c8e9d9f3596a8e877"}, - {file = "torch-2.3.0-cp311-none-macosx_11_0_arm64.whl", hash = "sha256:d24e328226d8e2af7cf80fcb1d2f1d108e0de32777fab4aaa2b37b9765d8be73"}, - {file = "torch-2.3.0-cp312-cp312-manylinux1_x86_64.whl", hash = "sha256:b0de2bdc0486ea7b14fc47ff805172df44e421a7318b7c4d92ef589a75d27410"}, - {file = "torch-2.3.0-cp312-cp312-manylinux2014_aarch64.whl", hash = "sha256:a306c87a3eead1ed47457822c01dfbd459fe2920f2d38cbdf90de18f23f72542"}, - {file = "torch-2.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:f9b98bf1a3c8af2d4c41f0bf1433920900896c446d1ddc128290ff146d1eb4bd"}, - {file = "torch-2.3.0-cp312-none-macosx_11_0_arm64.whl", hash = "sha256:dca986214267b34065a79000cee54232e62b41dff1ec2cab9abc3fc8b3dee0ad"}, - {file = "torch-2.3.0-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:20572f426965dd8a04e92a473d7e445fa579e09943cc0354f3e6fef6130ce061"}, - {file = "torch-2.3.0-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:e65ba85ae292909cde0dde6369826d51165a3fc8823dc1854cd9432d7f79b932"}, - {file = "torch-2.3.0-cp38-cp38-win_amd64.whl", hash = "sha256:5515503a193781fd1b3f5c474e89c9dfa2faaa782b2795cc4a7ab7e67de923f6"}, - {file = "torch-2.3.0-cp38-none-macosx_11_0_arm64.whl", hash = "sha256:6ae9f64b09516baa4ef890af0672dc981c20b1f0d829ce115d4420a247e88fba"}, - {file = "torch-2.3.0-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:cd0dc498b961ab19cb3f8dbf0c6c50e244f2f37dbfa05754ab44ea057c944ef9"}, - {file = "torch-2.3.0-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:e05f836559251e4096f3786ee99f4a8cbe67bc7fbedba8ad5e799681e47c5e80"}, - {file = "torch-2.3.0-cp39-cp39-win_amd64.whl", hash = "sha256:4fb27b35dbb32303c2927da86e27b54a92209ddfb7234afb1949ea2b3effffea"}, - {file = "torch-2.3.0-cp39-none-macosx_11_0_arm64.whl", hash = "sha256:760f8bedff506ce9e6e103498f9b1e9e15809e008368594c3a66bf74a8a51380"}, + {file = "torch-2.3.1-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:605a25b23944be5ab7c3467e843580e1d888b8066e5aaf17ff7bf9cc30001cc3"}, + {file = "torch-2.3.1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:f2357eb0965583a0954d6f9ad005bba0091f956aef879822274b1bcdb11bd308"}, + {file = "torch-2.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:32b05fe0d1ada7f69c9f86c14ff69b0ef1957a5a54199bacba63d22d8fab720b"}, + {file = "torch-2.3.1-cp310-none-macosx_11_0_arm64.whl", hash = "sha256:7c09a94362778428484bcf995f6004b04952106aee0ef45ff0b4bab484f5498d"}, + {file = "torch-2.3.1-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:b2ec81b61bb094ea4a9dee1cd3f7b76a44555375719ad29f05c0ca8ef596ad39"}, + {file = "torch-2.3.1-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:490cc3d917d1fe0bd027057dfe9941dc1d6d8e3cae76140f5dd9a7e5bc7130ab"}, + {file = "torch-2.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:5802530783bd465fe66c2df99123c9a54be06da118fbd785a25ab0a88123758a"}, + {file = "torch-2.3.1-cp311-none-macosx_11_0_arm64.whl", hash = "sha256:a7dd4ed388ad1f3d502bf09453d5fe596c7b121de7e0cfaca1e2017782e9bbac"}, + {file = "torch-2.3.1-cp312-cp312-manylinux1_x86_64.whl", hash = "sha256:a486c0b1976a118805fc7c9641d02df7afbb0c21e6b555d3bb985c9f9601b61a"}, + {file = "torch-2.3.1-cp312-cp312-manylinux2014_aarch64.whl", hash = "sha256:224259821fe3e4c6f7edf1528e4fe4ac779c77addaa74215eb0b63a5c474d66c"}, + {file = "torch-2.3.1-cp312-cp312-win_amd64.whl", hash = 
"sha256:e5fdccbf6f1334b2203a61a0e03821d5845f1421defe311dabeae2fc8fbeac2d"}, + {file = "torch-2.3.1-cp312-none-macosx_11_0_arm64.whl", hash = "sha256:3c333dc2ebc189561514eda06e81df22bf8fb64e2384746b2cb9f04f96d1d4c8"}, + {file = "torch-2.3.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:07e9ba746832b8d069cacb45f312cadd8ad02b81ea527ec9766c0e7404bb3feb"}, + {file = "torch-2.3.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:462d1c07dbf6bb5d9d2f3316fee73a24f3d12cd8dacf681ad46ef6418f7f6626"}, + {file = "torch-2.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:ff60bf7ce3de1d43ad3f6969983f321a31f0a45df3690921720bcad6a8596cc4"}, + {file = "torch-2.3.1-cp38-none-macosx_11_0_arm64.whl", hash = "sha256:bee0bd33dc58aa8fc8a7527876e9b9a0e812ad08122054a5bff2ce5abf005b10"}, + {file = "torch-2.3.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:aaa872abde9a3d4f91580f6396d54888620f4a0b92e3976a6034759df4b961ad"}, + {file = "torch-2.3.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:3d7a7f7ef21a7520510553dc3938b0c57c116a7daee20736a9e25cbc0e832bdc"}, + {file = "torch-2.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:4777f6cefa0c2b5fa87223c213e7b6f417cf254a45e5829be4ccd1b2a4ee1011"}, + {file = "torch-2.3.1-cp39-none-macosx_11_0_arm64.whl", hash = "sha256:2bb5af780c55be68fe100feb0528d2edebace1d55cb2e351de735809ba7391eb"}, ] [package.dependencies] @@ -10233,7 +9835,7 @@ nvidia-cusparse-cu12 = {version = "12.1.0.106", markers = "platform_system == \" nvidia-nccl-cu12 = {version = "2.20.5", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""} nvidia-nvtx-cu12 = {version = "12.1.105", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""} sympy = "*" -triton = {version = "2.3.0", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\" and python_version < \"3.12\""} +triton = {version = "2.3.1", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\" and python_version < \"3.12\""} typing-extensions = ">=4.8.0" [package.extras] @@ -10242,69 +9844,69 @@ optree = ["optree (>=0.9.1)"] [[package]] name = "torchaudio" -version = "2.3.0" +version = "2.3.1" description = "An audio package for PyTorch" optional = false python-versions = "*" files = [ - {file = "torchaudio-2.3.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:342108da83aa19a457c9a128b1206fadb603753b51cca022b9f585aac2f4754c"}, - {file = "torchaudio-2.3.0-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:73fedb2c631e01fa10feaac308540b836aefe758e55ca3ee026335e5d01e8e30"}, - {file = "torchaudio-2.3.0-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:e5bb50b7a4874ed97086c9e516dd90b103d954edcb5ed4b36f4fc22c4000a5a7"}, - {file = "torchaudio-2.3.0-cp310-cp310-win_amd64.whl", hash = "sha256:b4cc9cef5c98ed37e9405c4e0b0e6413bc101f3f49d45dc4f1d4e927757fe41e"}, - {file = "torchaudio-2.3.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:341ca3048ce6edcc731519b30187f0b13acb245c4efe16f925f69f9d533546e1"}, - {file = "torchaudio-2.3.0-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:8f2e0a28740bb0ee66369f92c811f33c0a47e6fcfc2de9cee89746472d713906"}, - {file = "torchaudio-2.3.0-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:61edb02ae9c0efea4399f9c1f899601136b24f35d430548284ea8eaf6ccbe3be"}, - {file = "torchaudio-2.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:04bc960cf1aef3b469b095a432a25496bc28197850fc2d90b7b52d6b5255487b"}, - {file = "torchaudio-2.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:535144a2fbba95fbb3b883224ffcf44788e4cecbabbe49c4a1ae3e7a74f71485"}, - {file = "torchaudio-2.3.0-cp312-cp312-manylinux1_x86_64.whl", hash = "sha256:fb3f52ed1d63b272c240d9bf051705312cb172212051b8a6a2f64d42e3cc1633"}, - {file = "torchaudio-2.3.0-cp312-cp312-manylinux2014_aarch64.whl", hash = "sha256:668a8b694e5522cff28cd5e02d01aa1b75ce940aa9fb40480892bdc623b1735d"}, - {file = "torchaudio-2.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:6c1f538018b85d7766835d042e555de2f096f7a69bba6b16031bf42a914dd9e1"}, - {file = "torchaudio-2.3.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7ba93265455dc363385e98c0cfcaeb586b7401af8a2c824811ee1466134a4f30"}, - {file = "torchaudio-2.3.0-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:21bb6d1b384fc8895133f01489133d575d4a715cd81734b89651fb0264bd8b80"}, - {file = "torchaudio-2.3.0-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:ed1866f508dc689c4f682d330b2ed4c83108d35865e4fb89431819364d8ad9ed"}, - {file = "torchaudio-2.3.0-cp38-cp38-win_amd64.whl", hash = "sha256:a3cbb230e2bb38ad1a1dd74aea242a154a9f76ab819d9c058b2c5074a9f5d7d2"}, - {file = "torchaudio-2.3.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f4b933776f20a36af5ddc57968fcb3da34dd03881db8d6760f3e1176803b9cf8"}, - {file = "torchaudio-2.3.0-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:c5e63cc2dbf179088b6cdfd21ecdbb943aa003c780075aa440162f231ee72db2"}, - {file = "torchaudio-2.3.0-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:d243bb8a1ee263c2cdafb9feed1569c3742d8135731e8f7818de12f4e0c83e28"}, - {file = "torchaudio-2.3.0-cp39-cp39-win_amd64.whl", hash = "sha256:6cd6d45cf8a45c89953e35434d9a461feb418e51e760adafc606a903dcbb9bd5"}, + {file = "torchaudio-2.3.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1f9134b27e5a7f0c1e33382fc0fe278e53695768cb0af02e8d22b5006c74a2ad"}, + {file = "torchaudio-2.3.1-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:88796183c12631dbc3dca58a74625e2fb6c5c7e50a54649df14239439d874ba6"}, + {file = "torchaudio-2.3.1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:6b57e773aad72743d50a64a7402a06cb8bdfcc709efc6d8c26429d940e6788e2"}, + {file = "torchaudio-2.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:5b1224f944d1a3fc9755bd2876df6824a42c60cf4f32a05426dfdcd9668466da"}, + {file = "torchaudio-2.3.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:01984f38398ca5e98ecfbfeafb72ae5b2131d0bb8aa464b5777addb3e4826877"}, + {file = "torchaudio-2.3.1-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:68815815e09105fe1171f0541681a7ebaf6d5d52b8e095ccde94b8064b107002"}, + {file = "torchaudio-2.3.1-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:c8c727c8341825bd18d91017c4c00f36b53b08f2176cdb9bdcb0def1c450b21d"}, + {file = "torchaudio-2.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:341e33450831146bc4c4cc8191d94484f1acc8bb566c2463a57c4133f792464e"}, + {file = "torchaudio-2.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5e36685420a07a176146e9d6e0fa8225198f126e167a00785538f853807e2d43"}, + {file = "torchaudio-2.3.1-cp312-cp312-manylinux1_x86_64.whl", hash = "sha256:07b72d76fa108ac0f3400a759456ba96bdaa2b8649fd9588cc93295a532b01d9"}, + {file = "torchaudio-2.3.1-cp312-cp312-manylinux2014_aarch64.whl", hash = "sha256:42af6c7a430e6268f2c028e06078d413912b5ec6efa28a097ebdd3c3c79659df"}, + {file = "torchaudio-2.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:25bd1137e47de96b48ef0dc4865bc620a0b759e44c009c7e78e92d7bfdf257ba"}, + {file = "torchaudio-2.3.1-cp38-cp38-macosx_11_0_arm64.whl", hash = 
"sha256:ce45e05acd544696c6a6f023d4fe8614ade57515799a1103b2418e854838d4a5"}, + {file = "torchaudio-2.3.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:6f8bc958ce1f24346dabe00d42e816f9b51698c00afe52492914761103e617a9"}, + {file = "torchaudio-2.3.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:9fd0f4bbc3fd585fbd7d976a988fe6e783fcb2e0db9d70dac60f40be072c6504"}, + {file = "torchaudio-2.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:d4982f4c520e49628507e968fb29c5db707108a8580b11593f049a932c8f2b98"}, + {file = "torchaudio-2.3.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:36e8c0b6532571c27a08a40dae428cd34af225007f15bcd77272643b6266b81d"}, + {file = "torchaudio-2.3.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:ae22a402fa862f7c3c177916f1b17482641d96b8bec56937e7df10739f3e3947"}, + {file = "torchaudio-2.3.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:4e3bca232f820c6a0fa5394424076cc519fae32288e7ff6f6d68bd71794dc354"}, + {file = "torchaudio-2.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:b7e0758b217e397bf2addfdc2df7c21f7dc34641968597a2a7e279c16e7c6d0b"}, ] [package.dependencies] -torch = "2.3.0" +torch = "2.3.1" [[package]] name = "torchvision" -version = "0.18.0" +version = "0.18.1" description = "image and video datasets and models for torch deep learning" optional = false python-versions = ">=3.8" files = [ - {file = "torchvision-0.18.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:dd61628a3d189c6852a12dc5ed4cd2eece66d2d67f35a866cb16f1dcb06c8c62"}, - {file = "torchvision-0.18.0-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:493c45f9937dad37aa1b64b14da17c7a589c72b91adc4837d431009cfe29bd53"}, - {file = "torchvision-0.18.0-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:5337f6acfa1fe959d5cb340d01a00614d6b31ce7a4824ccb95435a85c5273b95"}, - {file = "torchvision-0.18.0-cp310-cp310-win_amd64.whl", hash = "sha256:bd8e6f3b5beb49965f15c461302488edfa3d8c2d01d3bb79b150d6fb62711e3a"}, - {file = "torchvision-0.18.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6896a52168befe1105fb3c9335287390ed227e71d1e4ec4d68b62e8a3099fc09"}, - {file = "torchvision-0.18.0-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:3d7955398d4ceaad77c487c2c44f6f7813112402c9bab8cd906d346005891048"}, - {file = "torchvision-0.18.0-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:e5a24d620cea14a4bb89f24aa2b506230c0a16a3ada57fc53ad80cfd256a2128"}, - {file = "torchvision-0.18.0-cp311-cp311-win_amd64.whl", hash = "sha256:6ad70ddfa879bda5ed886b2518fe562640e0059787cbd65cb2bffa7674541410"}, - {file = "torchvision-0.18.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:eb9d83c0e1dbb54ecb0fb04c87f786333e3a6fb8b9c400aca7c31081f9aa5707"}, - {file = "torchvision-0.18.0-cp312-cp312-manylinux1_x86_64.whl", hash = "sha256:b657d052d146f24cb3b2a78219bfc82ae70a9706671c50f632528907d10cccec"}, - {file = "torchvision-0.18.0-cp312-cp312-manylinux2014_aarch64.whl", hash = "sha256:a964afbc7ddf50a46b941477f6c35729b416deedd139756befd488245e2e226d"}, - {file = "torchvision-0.18.0-cp312-cp312-win_amd64.whl", hash = "sha256:7c770f0f748e0b17f57c0297508d7254f686cdf03fc2e2949f422b20574f4c0f"}, - {file = "torchvision-0.18.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:2115a1906c015f5da9ceedc40a983313b0fd6e2c8a17108a92991706f51f6987"}, - {file = "torchvision-0.18.0-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:6323f7e5423ff2594d5891863b919deb9d0de95f01c36bf26fbd879036b6ed08"}, - {file = "torchvision-0.18.0-cp38-cp38-manylinux2014_aarch64.whl", hash = 
"sha256:925d0a82cccf6f986c18b29b4392a942db65cbdb73c13a129c8493822eb9e36f"}, - {file = "torchvision-0.18.0-cp38-cp38-win_amd64.whl", hash = "sha256:95b42d0dc599b47a01530c7439a5751e67e45b85e3a67113989cf7c7c70f2039"}, - {file = "torchvision-0.18.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:75e22ecf44a13b8f95b8ad421c0261282d859c61816badaca1959e073ccdd691"}, - {file = "torchvision-0.18.0-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:4c334b3e719ba0a9ba6e15d4aff1178f5e6d029174f346163fed525f0ccfffd3"}, - {file = "torchvision-0.18.0-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:36efd87001c6bee2383e043e46a025affb03179747c8f4777b9918527ffce756"}, - {file = "torchvision-0.18.0-cp39-cp39-win_amd64.whl", hash = "sha256:ccc292e093771d5baacf5535ac4416306b6b5f15676341cd4d010d8542eace25"}, + {file = "torchvision-0.18.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3e694e54b0548dad99c12af6bf0c8e4f3350137d391dcd19af22a1c5f89322b3"}, + {file = "torchvision-0.18.1-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:0b3bda0aa5b416eeb547143b8eeaf17720bdba9cf516dc991aacb81811aa96a5"}, + {file = "torchvision-0.18.1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:573ff523c739405edb085f65cb592f482d28a30e29b0be4c4ba08040b3ae785f"}, + {file = "torchvision-0.18.1-cp310-cp310-win_amd64.whl", hash = "sha256:ef7bbbc60b38e831a75e547c66ca1784f2ac27100f9e4ddbe9614cef6cbcd942"}, + {file = "torchvision-0.18.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:80b5d794dd0fdba787adc22f1a367a5ead452327686473cb260dd94364bc56a6"}, + {file = "torchvision-0.18.1-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:9077cf590cdb3a5e8fdf5cdb71797f8c67713f974cf0228ecb17fcd670ab42f9"}, + {file = "torchvision-0.18.1-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:ceb993a882f1ae7ae373ed39c28d7e3e802205b0e59a7ed84ef4028f0bba8d7f"}, + {file = "torchvision-0.18.1-cp311-cp311-win_amd64.whl", hash = "sha256:52f7436140045dc2239cdc502aa76b2bd8bd676d64244ff154d304aa69852046"}, + {file = "torchvision-0.18.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2be6f0bf7c455c89a51a1dbb6f668d36c6edc479f49ac912d745d10df5715657"}, + {file = "torchvision-0.18.1-cp312-cp312-manylinux1_x86_64.whl", hash = "sha256:f118d887bfde3a948a41d56587525401e5cac1b7db2eaca203324d6ed2b1caca"}, + {file = "torchvision-0.18.1-cp312-cp312-manylinux2014_aarch64.whl", hash = "sha256:13d24d904f65e62d66a1e0c41faec630bc193867b8a4a01166769e8a8e8df8e9"}, + {file = "torchvision-0.18.1-cp312-cp312-win_amd64.whl", hash = "sha256:ed6340b69a63a625e512a66127210d412551d9c5f2ad2978130c6a45bf56cd4a"}, + {file = "torchvision-0.18.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b1c3864fa9378c88bce8ad0ef3599f4f25397897ce612e1c245c74b97092f35e"}, + {file = "torchvision-0.18.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:02085a2ffc7461f5c0edb07d6f3455ee1806561f37736b903da820067eea58c7"}, + {file = "torchvision-0.18.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:9726c316a2501df8503e5a5dc46a631afd4c515a958972e5b7f7b9c87d2125c0"}, + {file = "torchvision-0.18.1-cp38-cp38-win_amd64.whl", hash = "sha256:64a2662dbf30db9055d8b201d6e56f312a504e5ccd9d144c57c41622d3c524cb"}, + {file = "torchvision-0.18.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:975b8594c0f5288875408acbb74946eea786c5b008d129c0d045d0ead23742bc"}, + {file = "torchvision-0.18.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:da83c8bbd34d8bee48bfa1d1b40e0844bc3cba10ed825a5a8cbe3ce7b62264cd"}, + {file = "torchvision-0.18.1-cp39-cp39-manylinux2014_aarch64.whl", hash = 
"sha256:54bfcd352abb396d5c9c237d200167c178bd136051b138e1e8ef46ce367c2773"}, + {file = "torchvision-0.18.1-cp39-cp39-win_amd64.whl", hash = "sha256:5c8366a1aeee49e9ea9e64b30d199debdf06b1bd7610a76165eb5d7869c3bde5"}, ] [package.dependencies] numpy = "*" pillow = ">=5.3.0,<8.3.dev0 || >=8.4.dev0" -torch = "2.3.0" +torch = "2.3.1" [package.extras] scipy = ["scipy"] @@ -10434,17 +10036,17 @@ vision = ["Pillow (>=10.0.1,<=15.0)"] [[package]] name = "triton" -version = "2.3.0" +version = "2.3.1" description = "A language and compiler for custom Deep Learning operations" optional = false python-versions = "*" files = [ - {file = "triton-2.3.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5ce4b8ff70c48e47274c66f269cce8861cf1dc347ceeb7a67414ca151b1822d8"}, - {file = "triton-2.3.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3c3d9607f85103afdb279938fc1dd2a66e4f5999a58eb48a346bd42738f986dd"}, - {file = "triton-2.3.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:218d742e67480d9581bafb73ed598416cc8a56f6316152e5562ee65e33de01c0"}, - {file = "triton-2.3.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:381ec6b3dac06922d3e4099cfc943ef032893b25415de295e82b1a82b0359d2c"}, - {file = "triton-2.3.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:038e06a09c06a164fef9c48de3af1e13a63dc1ba3c792871e61a8e79720ea440"}, - {file = "triton-2.3.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6d8f636e0341ac348899a47a057c3daea99ea7db31528a225a3ba4ded28ccc65"}, + {file = "triton-2.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3c84595cbe5e546b1b290d2a58b1494df5a2ef066dd890655e5b8a8a92205c33"}, + {file = "triton-2.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c9d64ae33bcb3a7a18081e3a746e8cf87ca8623ca13d2c362413ce7a486f893e"}, + {file = "triton-2.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eaf80e8761a9e3498aa92e7bf83a085b31959c61f5e8ac14eedd018df6fccd10"}, + {file = "triton-2.3.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b13bf35a2b659af7159bf78e92798dc62d877aa991de723937329e2d382f1991"}, + {file = "triton-2.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63381e35ded3304704ea867ffde3b7cfc42c16a55b3062d41e017ef510433d66"}, + {file = "triton-2.3.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1d968264523c7a07911c8fb51b4e0d1b920204dae71491b1fe7b01b62a31e124"}, ] [package.dependencies] @@ -10467,7 +10069,6 @@ files = [ ] [package.dependencies] -importlib-metadata = {version = ">=3.6", markers = "python_version < \"3.10\""} typing-extensions = ">=4.10.0" [package.extras] @@ -10476,13 +10077,13 @@ test = ["coverage[toml] (>=7)", "mypy (>=1.2.0)", "pytest (>=7)"] [[package]] name = "typer" -version = "0.12.4" +version = "0.12.5" description = "Typer, build great CLIs. Easy to code. Based on Python type hints." 
optional = false python-versions = ">=3.7" files = [ - {file = "typer-0.12.4-py3-none-any.whl", hash = "sha256:819aa03699f438397e876aa12b0d63766864ecba1b579092cc9fe35d886e34b6"}, - {file = "typer-0.12.4.tar.gz", hash = "sha256:c9c1613ed6a166162705b3347b8d10b661ccc5d95692654d0fb628118f2c34e6"}, + {file = "typer-0.12.5-py3-none-any.whl", hash = "sha256:62fe4e471711b147e3365034133904df3e235698399bc4de2b36c8579298d52b"}, + {file = "typer-0.12.5.tar.gz", hash = "sha256:f592f089bedcc8ec1b974125d64851029c3b1af145f04aca64d69410f0c9b722"}, ] [package.dependencies] @@ -10541,20 +10142,6 @@ tzdata = {version = "*", markers = "platform_system == \"Windows\""} [package.extras] devenv = ["check-manifest", "pytest (>=4.3)", "pytest-cov", "pytest-mock (>=3.3)", "zest.releaser"] -[[package]] -name = "uc-micro-py" -version = "1.0.3" -description = "Micro subset of unicode data files for linkify-it-py projects." -optional = false -python-versions = ">=3.7" -files = [ - {file = "uc-micro-py-1.0.3.tar.gz", hash = "sha256:d321b92cff673ec58027c04015fcaa8bb1e005478643ff4a500882eaab88c48a"}, - {file = "uc_micro_py-1.0.3-py3-none-any.whl", hash = "sha256:db1dffff340817673d7b466ec86114a9dc0e9d4d9b5ba229d9d60e5c12600cd5"}, -] - -[package.extras] -test = ["coverage", "pytest", "pytest-cov"] - [[package]] name = "umap-learn" version = "0.5.6" @@ -10982,13 +10569,13 @@ files = [ [[package]] name = "widgetsnbextension" -version = "4.0.11" +version = "4.0.13" description = "Jupyter interactive widgets for Jupyter Notebook" optional = false python-versions = ">=3.7" files = [ - {file = "widgetsnbextension-4.0.11-py3-none-any.whl", hash = "sha256:55d4d6949d100e0d08b94948a42efc3ed6dfdc0e9468b2c4b128c9a2ce3a7a36"}, - {file = "widgetsnbextension-4.0.11.tar.gz", hash = "sha256:8b22a8f1910bfd188e596fe7fc05dcbd87e810c8a4ba010bdb3da86637398474"}, + {file = "widgetsnbextension-4.0.13-py3-none-any.whl", hash = "sha256:74b2692e8500525cc38c2b877236ba51d34541e6385eeed5aec15a70f88a6c71"}, + {file = "widgetsnbextension-4.0.13.tar.gz", hash = "sha256:ffcb67bc9febd10234a362795f643927f4e0c05d9342c727b65d2384f8feacb6"}, ] [[package]] @@ -11215,5 +10802,5 @@ test = ["big-O", "importlib-resources", "jaraco.functools", "jaraco.itertools", [metadata] lock-version = "2.0" -python-versions = ">=3.9,<3.12" -content-hash = "fb72f95dbfa5da2e515705def9aa7f96bff4a6a5a3aca2801093aeefc386b82b" +python-versions = ">=3.10,<3.12" +content-hash = "45360be27a46c6c43047fbdf3e288965838c56b22f5553cbc7b4a43120263b6d" diff --git a/software/pyproject.toml b/software/pyproject.toml index d2ec725..5bd4af7 100644 --- a/software/pyproject.toml +++ b/software/pyproject.toml @@ -3,51 +3,15 @@ name = "01OS" packages = [ {include = "source"}, ] -include = ["start.py"] +include = ["main.py"] version = "0.0.14" -description = "The open-source language model computer" +description = "The #1 open-source voice interface for desktop, mobile, and ESP32 chips." 
authors = ["Killian "]
license = "AGPL"
readme = "../README.md"
[tool.poetry.dependencies]
-python = ">=3.9,<3.12"
-pyaudio = "^0.2.14"
-pynput = "^1.7.6"
-websockets = "^12.0"
-python-dotenv = "^1.0.1"
-ffmpeg-python = "^0.2.0"
-textual = "^0.50.1"
-pydub = "^0.25.1"
-ngrok = "^1.0.0"
-simpleaudio = "^1.0.4"
-opencv-python = "^4.9.0.80"
-psutil = "^5.9.8"
-platformdirs = "^4.2.0"
-rich = "^13.7.1"
-pytimeparse = "^1.1.8"
-python-crontab = "^3.0.0"
-inquirer = "^3.2.4"
-pyqrcode = "^1.2.1"
-realtimestt = "^0.1.16"
-realtimetts = { version = "^0.4.2", extras = ["all"] }
-keyboard = "^0.13.5"
-pyautogui = "^0.9.54"
-ctranslate2 = "4.1.0"
-#py3-tts = "^3.5"
-#elevenlabs = "1.2.2"
-groq = "^0.5.0"
-open-interpreter = {git = "https://github.com/OpenInterpreter/open-interpreter.git", branch = "development", extras = ["os", "server"]}
-litellm = "*"
-openai = "*"
-pywebview = "*"
-pyobjc = "*"
-sentry-sdk = "^2.4.0"
-plyer = "^2.1.0"
-pywinctl = "^0.3"
-certifi = "^2024.7.4"
-pygame = "^2.6.0"
-mpv = "^1.0.7"
+python = ">=3.10,<3.12"
livekit = "^0.12.1"
livekit-agents = "^0.8.6"
livekit-plugins-deepgram = "^0.6.5"
@@ -55,13 +19,19 @@ livekit-plugins-openai = "^0.8.1"
livekit-plugins-silero = "^0.6.4"
livekit-plugins-elevenlabs = "^0.7.3"
segno = "^1.6.1"
+open-interpreter = {extras = ["os", "server"], version = "^0.3.8"}
+ngrok = "^1.4.0"
+realtimetts = {extras = ["all"], version = "^0.4.5"}
+realtimestt = "^0.2.41"
+pynput = "^1.7.7"
+yaspin = "^3.0.2"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
[tool.poetry.scripts]
-01 = "start:app"
+01 = "main:app"
[tool.poetry.group.dev.dependencies]
black = "^24.3.0"
diff --git a/software/pytest.ini b/software/pytest.ini
deleted file mode 100644
index acff36e..0000000
--- a/software/pytest.ini
+++ /dev/null
@@ -1,10 +0,0 @@
-; Config for Pytest Runner.
-; suppress Deprecation Warning and User Warning to not spam the interface, but check periodically
-
-[pytest]
-python_files = tests.py test_*.py
-filterwarnings =
-    ignore::UserWarning
-    ignore::DeprecationWarning
-log_cli = true
-log_cli_level = INFO
diff --git a/software/source/clients/archive_base_device.py b/software/source/clients/archive_base_device.py
deleted file mode 100644
index 4810488..0000000
--- a/software/source/clients/archive_base_device.py
+++ /dev/null
@@ -1,482 +0,0 @@
-from dotenv import load_dotenv
-
-load_dotenv() # take environment variables from .env.
-
-import subprocess
-import os
-import sys
-import asyncio
-import threading
-import pyaudio
-from pynput import keyboard
-import json
-import traceback
-import websockets
-import queue
-from pydub import AudioSegment
-from pydub.playback import play
-import time
-import wave
-import tempfile
-from datetime import datetime
-import cv2
-import base64
-import platform
-from interpreter import (
-    interpreter,
-) # Just for code execution. Maybe we should let people do from interpreter.computer import run?
-
-# In the future, I guess kernel watching code should be elsewhere? Somewhere server / client agnostic?
-from ..server.utils.kernel import put_kernel_messages_into_queue -from ..server.utils.get_system_info import get_system_info -from ..server.utils.process_utils import kill_process_tree - -from ..server.utils.logs import setup_logging -from ..server.utils.logs import logger - -setup_logging() - -os.environ["STT_RUNNER"] = "server" -os.environ["TTS_RUNNER"] = "server" - -from ..utils.accumulator import Accumulator - -accumulator = Accumulator() - -# Configuration for Audio Recording -CHUNK = 1024 # Record in chunks of 1024 samples -FORMAT = pyaudio.paInt16 # 16 bits per sample -CHANNELS = 1 # Mono -RATE = 16000 # Sample rate -RECORDING = False # Flag to control recording state -SPACEBAR_PRESSED = False # Flag to track spacebar press state - -# Camera configuration -CAMERA_ENABLED = os.getenv("CAMERA_ENABLED", False) -if type(CAMERA_ENABLED) == str: - CAMERA_ENABLED = CAMERA_ENABLED.lower() == "true" -CAMERA_DEVICE_INDEX = int(os.getenv("CAMERA_DEVICE_INDEX", 0)) -CAMERA_WARMUP_SECONDS = float(os.getenv("CAMERA_WARMUP_SECONDS", 0)) - -# Specify OS -current_platform = get_system_info() - - -def is_win11(): - return sys.getwindowsversion().build >= 22000 - - -def is_win10(): - try: - return ( - platform.system() == "Windows" - and "10" in platform.version() - and not is_win11() - ) - except: - return False - - -# Initialize PyAudio -p = pyaudio.PyAudio() - -send_queue = queue.Queue() - - -class Device: - def __init__(self): - self.pressed_keys = set() - self.captured_images = [] - self.audiosegments = asyncio.Queue() - self.server_url = "" - self.ctrl_pressed = False - self.tts_service = "" - self.debug = False - self.playback_latency = None - - def fetch_image_from_camera(self, camera_index=CAMERA_DEVICE_INDEX): - """Captures an image from the specified camera device and saves it to a temporary file. Adds the image to the captured_images list.""" - image_path = None - - cap = cv2.VideoCapture(camera_index) - ret, frame = cap.read() # Capture a single frame to initialize the camera - - if CAMERA_WARMUP_SECONDS > 0: - # Allow camera to warm up, then snap a picture again - # This is a workaround for some cameras that don't return a properly exposed - # picture immediately when they are first turned on - time.sleep(CAMERA_WARMUP_SECONDS) - ret, frame = cap.read() - - if ret: - temp_dir = tempfile.gettempdir() - image_path = os.path.join( - temp_dir, f"01_photo_{datetime.now().strftime('%Y%m%d%H%M%S%f')}.png" - ) - self.captured_images.append(image_path) - cv2.imwrite(image_path, frame) - logger.info(f"Camera image captured to {image_path}") - logger.info( - f"You now have {len(self.captured_images)} images which will be sent along with your next audio message." 
- ) - else: - logger.error( - f"Error: Couldn't capture an image from camera ({camera_index})" - ) - - cap.release() - - return image_path - - def encode_image_to_base64(self, image_path): - """Encodes an image file to a base64 string.""" - with open(image_path, "rb") as image_file: - return base64.b64encode(image_file.read()).decode("utf-8") - - def add_image_to_send_queue(self, image_path): - """Encodes an image and adds an LMC message to the send queue with the image data.""" - base64_image = self.encode_image_to_base64(image_path) - image_message = { - "role": "user", - "type": "image", - "format": "base64.png", - "content": base64_image, - } - send_queue.put(image_message) - # Delete the image file from the file system after sending it - os.remove(image_path) - - def queue_all_captured_images(self): - """Queues all captured images to be sent.""" - for image_path in self.captured_images: - self.add_image_to_send_queue(image_path) - self.captured_images.clear() # Clear the list after sending - - async def play_audiosegments(self): - """Plays them sequentially.""" - - if self.tts_service == "elevenlabs": - print("Ensure `mpv` in installed to use `elevenlabs`.\n\n(On macOSX, you can run `brew install mpv`.)") - mpv_command = ["mpv", "--no-cache", "--no-terminal", "--", "fd://0"] - mpv_process = subprocess.Popen( - mpv_command, - stdin=subprocess.PIPE, - stdout=subprocess.DEVNULL, - stderr=subprocess.DEVNULL, - ) - - while True: - try: - audio = await self.audiosegments.get() - if self.debug and self.playback_latency and isinstance(audio, bytes): - elapsed_time = time.time() - self.playback_latency - print(f"Time from request to playback: {elapsed_time} seconds") - self.playback_latency = None - - if self.tts_service == "elevenlabs": - mpv_process.stdin.write(audio) # type: ignore - mpv_process.stdin.flush() # type: ignore - else: - play(audio) - - await asyncio.sleep(0.1) - except asyncio.exceptions.CancelledError: - # This happens once at the start? - pass - except: - logger.info(traceback.format_exc()) - - def record_audio(self): - if os.getenv("STT_RUNNER") == "server": - # STT will happen on the server. we're sending audio. - send_queue.put( - {"role": "user", "type": "audio", "format": "bytes.wav", "start": True} - ) - elif os.getenv("STT_RUNNER") == "client": - # STT will happen here, on the client. we're sending text. - send_queue.put({"role": "user", "type": "message", "start": True}) - else: - raise Exception("STT_RUNNER must be set to either 'client' or 'server'.") - - """Record audio from the microphone and add it to the queue.""" - stream = p.open( - format=FORMAT, - channels=CHANNELS, - rate=RATE, - input=True, - frames_per_buffer=CHUNK, - ) - print("Recording started...") - global RECORDING - - # Create a temporary WAV file to store the audio data - temp_dir = tempfile.gettempdir() - wav_path = os.path.join( - temp_dir, f"audio_{datetime.now().strftime('%Y%m%d%H%M%S%f')}.wav" - ) - wav_file = wave.open(wav_path, "wb") - wav_file.setnchannels(CHANNELS) - wav_file.setsampwidth(p.get_sample_size(FORMAT)) - wav_file.setframerate(RATE) - - while RECORDING: - data = stream.read(CHUNK, exception_on_overflow=False) - wav_file.writeframes(data) - - wav_file.close() - stream.stop_stream() - stream.close() - print("Recording stopped.") - if self.debug: - self.playback_latency = time.time() - - duration = wav_file.getnframes() / RATE - if duration < 0.3: - # Just pressed it. 
Send stop message - if os.getenv("STT_RUNNER") == "client": - send_queue.put({"role": "user", "type": "message", "content": "stop"}) - send_queue.put({"role": "user", "type": "message", "end": True}) - else: - send_queue.put( - { - "role": "user", - "type": "audio", - "format": "bytes.wav", - "content": "", - } - ) - send_queue.put( - { - "role": "user", - "type": "audio", - "format": "bytes.wav", - "end": True, - } - ) - else: - self.queue_all_captured_images() - - if os.getenv("STT_RUNNER") == "client": - # THIS DOES NOT WORK. We moved to this very cool stt_service, llm_service - # way of doing things. stt_wav is not a thing anymore. Needs work to work - - # Run stt then send text - text = stt_wav(wav_path) - logger.debug(f"STT result: {text}") - send_queue.put({"role": "user", "type": "message", "content": text}) - send_queue.put({"role": "user", "type": "message", "end": True}) - else: - # Stream audio - with open(wav_path, "rb") as audio_file: - byte_data = audio_file.read(CHUNK) - while byte_data: - send_queue.put(byte_data) - byte_data = audio_file.read(CHUNK) - send_queue.put( - { - "role": "user", - "type": "audio", - "format": "bytes.wav", - "end": True, - } - ) - - if os.path.exists(wav_path): - os.remove(wav_path) - - def toggle_recording(self, state): - """Toggle the recording state.""" - global RECORDING, SPACEBAR_PRESSED - if state and not SPACEBAR_PRESSED: - SPACEBAR_PRESSED = True - if not RECORDING: - RECORDING = True - threading.Thread(target=self.record_audio).start() - elif not state and SPACEBAR_PRESSED: - SPACEBAR_PRESSED = False - RECORDING = False - - def on_press(self, key): - """Detect spacebar press and Ctrl+C combination.""" - self.pressed_keys.add(key) # Add the pressed key to the set - - if keyboard.Key.space in self.pressed_keys: - self.toggle_recording(True) - elif {keyboard.Key.ctrl, keyboard.KeyCode.from_char("c")} <= self.pressed_keys: - logger.info("Ctrl+C pressed. Exiting...") - kill_process_tree() - os._exit(0) - - # Windows alternative to the above - if key == keyboard.Key.ctrl_l: - self.ctrl_pressed = True - - try: - if key.vk == 67 and self.ctrl_pressed: - logger.info("Ctrl+C pressed. Exiting...") - kill_process_tree() - os._exit(0) - # For non-character keys - except: - pass - - def on_release(self, key): - """Detect spacebar release and 'c' key press for camera, and handle key release.""" - self.pressed_keys.discard( - key - ) # Remove the released key from the key press tracking set - - if key == keyboard.Key.ctrl_l: - self.ctrl_pressed = False - if key == keyboard.Key.space: - self.toggle_recording(False) - elif CAMERA_ENABLED and key == keyboard.KeyCode.from_char("c"): - self.fetch_image_from_camera() - - async def message_sender(self, websocket): - while True: - message = await asyncio.get_event_loop().run_in_executor( - None, send_queue.get - ) - if isinstance(message, bytes): - await websocket.send(message) - else: - await websocket.send(json.dumps(message)) - send_queue.task_done() - await asyncio.sleep(0.01) - - async def websocket_communication(self, WS_URL): - show_connection_log = True - - async def exec_ws_communication(websocket): - if CAMERA_ENABLED: - print( - "\nHold the spacebar to start recording. Press 'c' to capture an image from the camera. Press CTRL-C to exit." - ) - else: - print("\nHold the spacebar to start recording. 
Press CTRL-C to exit.") - - asyncio.create_task(self.message_sender(websocket)) - - while True: - await asyncio.sleep(0.01) - chunk = await websocket.recv() - - logger.debug(f"Got this message from the server: {type(chunk)} {chunk}") - # print("received chunk from server") - - if type(chunk) == str: - chunk = json.loads(chunk) - - if chunk.get("type") == "config": - self.tts_service = chunk.get("tts_service") - continue - - if self.tts_service == "elevenlabs": - message = chunk - else: - message = accumulator.accumulate(chunk) - - if message == None: - # Will be None until we have a full message ready - continue - - # At this point, we have our message - if isinstance(message, bytes) or ( - message["type"] == "audio" and message["format"].startswith("bytes") - ): - # Convert bytes to audio file - if self.tts_service == "elevenlabs": - audio_bytes = message - audio = audio_bytes - else: - audio_bytes = message["content"] - - # Create an AudioSegment instance with the raw data - audio = AudioSegment( - # raw audio data (bytes) - data=audio_bytes, - # signed 16-bit little-endian format - sample_width=2, - # 16,000 Hz frame rate - frame_rate=22050, - # mono sound - channels=1, - ) - - await self.audiosegments.put(audio) - - # Run the code if that's the client's job - if os.getenv("CODE_RUNNER") == "client": - if message["type"] == "code" and "end" in message: - language = message["format"] - code = message["content"] - result = interpreter.computer.run(language, code) - send_queue.put(result) - - if is_win10(): - logger.info("Windows 10 detected") - # Workaround for Windows 10 not latching to the websocket server. - # See https://github.com/OpenInterpreter/01/issues/197 - try: - ws = websockets.connect(WS_URL) - await exec_ws_communication(ws) - except Exception as e: - logger.error(f"Error while attempting to connect: {e}") - else: - while True: - try: - async with websockets.connect(WS_URL) as websocket: - await exec_ws_communication(websocket) - except: - logger.debug(traceback.format_exc()) - if show_connection_log: - logger.info(f"Connecting to `{WS_URL}`...") - show_connection_log = False - await asyncio.sleep(2) - - async def start_async(self): - # Configuration for WebSocket - WS_URL = f"ws://{self.server_url}" - # Start the WebSocket communication - asyncio.create_task(self.websocket_communication(WS_URL)) - - # Start watching the kernel if it's your job to do that - if os.getenv("CODE_RUNNER") == "client": - # client is not running code! 
- asyncio.create_task(put_kernel_messages_into_queue(send_queue)) - - asyncio.create_task(self.play_audiosegments()) - - # If Raspberry Pi, add the button listener, otherwise use the spacebar - if current_platform.startswith("raspberry-pi"): - logger.info("Raspberry Pi detected, using button on GPIO pin 15") - # Use GPIO pin 15 - pindef = ["gpiochip4", "15"] # gpiofind PIN15 - print("PINDEF", pindef) - - # HACK: needs passwordless sudo - process = await asyncio.create_subprocess_exec( - "sudo", "gpiomon", "-brf", *pindef, stdout=asyncio.subprocess.PIPE - ) - while True: - line = await process.stdout.readline() - if line: - line = line.decode().strip() - if "FALLING" in line: - self.toggle_recording(False) - elif "RISING" in line: - self.toggle_recording(True) - else: - break - else: - # Keyboard listener for spacebar press/release - listener = keyboard.Listener( - on_press=self.on_press, on_release=self.on_release - ) - listener.start() - - def start(self): - if os.getenv("TEACH_MODE") != "True": - asyncio.run(self.start_async()) - p.terminate() diff --git a/software/source/clients/base_device.py b/software/source/clients/base_device.py deleted file mode 100644 index 2be26e7..0000000 --- a/software/source/clients/base_device.py +++ /dev/null @@ -1,99 +0,0 @@ -import asyncio -import websockets -import pyaudio -from pynput import keyboard -import json -from yaspin import yaspin - -CHUNK = 1024 -FORMAT = pyaudio.paInt16 -CHANNELS = 1 -RECORDING_RATE = 16000 -PLAYBACK_RATE = 24000 - -class Device: - def __init__(self): - self.server_url = "0.0.0.0:10001" - self.p = pyaudio.PyAudio() - self.websocket = None - self.recording = False - self.input_stream = None - self.output_stream = None - self.spinner = yaspin() - self.play_audio = True - - async def connect_with_retry(self, max_retries=50, retry_delay=2): - for attempt in range(max_retries): - try: - self.websocket = await websockets.connect(f"ws://{self.server_url}") - print("Connected to server.") - - # Send auth, which the server requires (docs.openinterpreter.com/server/usage) - await self.websocket.send(json.dumps({"auth": True})) - - return - except ConnectionRefusedError: - if attempt % 4 == 0: - print(f"Waiting for the server to be ready...") - await asyncio.sleep(retry_delay) - raise Exception("Failed to connect to the server after multiple attempts") - - async def send_audio(self): - self.input_stream = self.p.open(format=FORMAT, channels=CHANNELS, rate=RECORDING_RATE, input=True, frames_per_buffer=CHUNK) - while True: - if self.recording: - try: - # Send start flag - await self.websocket.send(json.dumps({"role": "user", "type": "audio", "format": "bytes.wav", "start": True})) - # print("Sending audio start message") - - while self.recording: - data = self.input_stream.read(CHUNK, exception_on_overflow=False) - await self.websocket.send(data) - - # Send stop flag - await self.websocket.send(json.dumps({"role": "user", "type": "audio", "format": "bytes.wav", "end": True})) - # print("Sending audio end message") - except Exception as e: - print(f"Error in send_audio: {e}") - await asyncio.sleep(0.01) - - async def receive_audio(self): - self.output_stream = self.p.open(format=FORMAT, channels=CHANNELS, rate=PLAYBACK_RATE, output=True, frames_per_buffer=CHUNK) - while True: - try: - data = await self.websocket.recv() - if self.play_audio and isinstance(data, bytes) and not self.recording: - self.output_stream.write(data) - except Exception as e: - await self.connect_with_retry() - - def on_press(self, key): - if key == keyboard.Key.ctrl 
and not self.recording:
-            #print("Space pressed, starting recording")
-            print("\n")
-            self.spinner.start()
-            self.recording = True
-
-    def on_release(self, key):
-        if key == keyboard.Key.ctrl:
-            self.spinner.stop()
-            #print("Space released, stopping recording")
-            self.recording = False
-        # elif key == keyboard.Key.esc:
-        #     print("Esc pressed, stopping the program")
-        #     return False
-
-    async def main(self):
-        await self.connect_with_retry()
-        print("Hold CTRL to record. Press 'CTRL-C' to quit.")
-        listener = keyboard.Listener(on_press=self.on_press, on_release=self.on_release)
-        listener.start()
-        await asyncio.gather(self.send_audio(), self.receive_audio())
-
-    def start(self):
-        asyncio.run(self.main())
-
-if __name__ == "__main__":
-    device = Device()
-    device.start()
\ No newline at end of file
diff --git a/software/source/clients/esp32/README.md b/software/source/clients/esp32/README.md
deleted file mode 100644
index 48b6a3a..0000000
--- a/software/source/clients/esp32/README.md
+++ /dev/null
@@ -1,28 +0,0 @@
-# ESP32 Playback
-
-To set up audio recording + playback on the ESP32 (M5 Atom), do the following:
-
-1. Open the Arduino IDE and open the `client/client.ino` file
-2. Go to Tools -> Board -> Boards Manager, search "esp32", then install the boards by Arduino and Espressif
-3. Go to Tools -> Manage Libraries, then install the following (_with_ dependencies, if it prompts you to install with/without dependencies):
-   - M5Atom by M5Stack
-   - WebSockets by Markus Sattler
-   - ESPAsyncWebServer by lacamera
-4. The board needs to connect to WiFi. After flashing, connect to the ESP32's "captive" WiFi network, which will collect your WiFi details. Once it connects, it will ask you to enter the 01OS server address in the format "domain.com:port" or "ip:port". Once it's able to connect, you can use the device.
-5. To flash the .ino to the board, connect the board to a USB port, select the port from the dropdown in the IDE, then select the M5Atom board (or M5Stack-ATOM if you have that). Click Upload to flash the board.
-
-### Alternative - PlatformIO
-
-You don't need to install anything manually; PlatformIO will install everything for you: dependencies, toolchains, and so on.
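For reference, PlatformIO drives the build from a `platformio.ini` file in the firmware directory. The sketch below is only illustrative of what such a config looks like; the environment name, board ID, and library identifiers are assumptions and should be checked against the project's own `platformio.ini` and the Arduino libraries listed above.

```ini
; Illustrative sketch only -- the repo's own platformio.ini is authoritative.
; The board ID and lib_deps identifiers below are assumptions; verify them
; against the Arduino libraries listed above before relying on them.
[env:m5stack-atom]
platform = espressif32
board = m5stack-atom
framework = arduino
monitor_speed = 115200
lib_deps =
    m5stack/M5Atom
    links2004/WebSockets
    lacamera/ESPAsyncWebServer
```

With a config along these lines in place, the build command shown below resolves the toolchain and libraries automatically on first run.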
- -Please install first [PlatformIO](http://platformio.org/) open source ecosystem for IoT development compatible with **Arduino** IDE and its command line tools (Windows, MacOs and Linux), and then enter to the firmware directory: - -```bash -cd client/ -``` - -And build and upload the firmware with a simple command: - -```bash -pio run --target upload -``` diff --git a/software/source/clients/light-python/client.py b/software/source/clients/light-python/client.py index 5672069..292f166 100644 --- a/software/source/clients/light-python/client.py +++ b/software/source/clients/light-python/client.py @@ -26,15 +26,14 @@ class Device: for attempt in range(max_retries): try: self.websocket = await websockets.connect(f"ws://{self.server_url}") - print("Connected to server.") # Send auth, which the server requires (docs.openinterpreter.com/server/usage) await self.websocket.send(json.dumps({"auth": True})) return except ConnectionRefusedError: - if attempt % 4 == 0: - print(f"Waiting for the server to be ready...") + if attempt % 8 == 0 and attempt != 0: + print(f"Loading...") await asyncio.sleep(retry_delay) raise Exception("Failed to connect to the server after multiple attempts") @@ -71,7 +70,7 @@ class Device: def on_press(self, key): if key == keyboard.Key.ctrl and not self.recording: #print("Space pressed, starting recording") - print("\n") + print("") self.spinner.start() self.recording = True @@ -86,7 +85,7 @@ class Device: async def main(self): await self.connect_with_retry() - print("Hold CTRL to speak to the assistant. Press 'CTRL-C' to quit.") + print("\nHold CTRL to speak to your assistant. Press 'CTRL-C' to quit.") listener = keyboard.Listener(on_press=self.on_press, on_release=self.on_release) listener.start() await asyncio.gather(self.send_audio(), self.receive_audio()) diff --git a/software/source/clients/mac/beeps.py b/software/source/clients/light-python/macos_beeps.py similarity index 100% rename from software/source/clients/mac/beeps.py rename to software/source/clients/light-python/macos_beeps.py diff --git a/software/source/clients/linux/__init__.py b/software/source/clients/linux/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/software/source/clients/linux/device.py b/software/source/clients/linux/device.py deleted file mode 100644 index 36182fb..0000000 --- a/software/source/clients/linux/device.py +++ /dev/null @@ -1,13 +0,0 @@ -from ..base_device import Device - -device = Device() - - -def main(server_url, debug): - device.server_url = server_url - device.debug = debug - device.start() - - -if __name__ == "__main__": - main() diff --git a/software/source/clients/mac/__init__.py b/software/source/clients/mac/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/software/source/clients/mac/device.py b/software/source/clients/mac/device.py deleted file mode 100644 index 006d181..0000000 --- a/software/source/clients/mac/device.py +++ /dev/null @@ -1,14 +0,0 @@ -from ..base_device import Device - -device = Device() - - -def main(server_url, debug, play_audio): - device.server_url = server_url - device.debug = debug - device.play_audio = play_audio - device.start() - - -if __name__ == "__main__": - main() diff --git a/software/source/clients/ios/README.md b/software/source/clients/mobile/ios/README.md similarity index 100% rename from software/source/clients/ios/README.md rename to software/source/clients/mobile/ios/README.md diff --git a/software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.pbxproj 
b/software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.pbxproj similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.pbxproj rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.pbxproj diff --git a/software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/contents.xcworkspacedata b/software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/contents.xcworkspacedata similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/contents.xcworkspacedata rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/contents.xcworkspacedata diff --git a/software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcshareddata/IDEWorkspaceChecks.plist b/software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcshareddata/IDEWorkspaceChecks.plist similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcshareddata/IDEWorkspaceChecks.plist rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcshareddata/IDEWorkspaceChecks.plist diff --git a/software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcshareddata/WorkspaceSettings.xcsettings b/software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcshareddata/WorkspaceSettings.xcsettings similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcshareddata/WorkspaceSettings.xcsettings rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcshareddata/WorkspaceSettings.xcsettings diff --git a/software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcuserdata/eladdekel.xcuserdatad/UserInterfaceState.xcuserstate b/software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcuserdata/eladdekel.xcuserdatad/UserInterfaceState.xcuserstate similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcuserdata/eladdekel.xcuserdatad/UserInterfaceState.xcuserstate rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcuserdata/eladdekel.xcuserdatad/UserInterfaceState.xcuserstate diff --git a/software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcuserdata/eladdekel.xcuserdatad/WorkspaceSettings.xcsettings b/software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcuserdata/eladdekel.xcuserdatad/WorkspaceSettings.xcsettings similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcuserdata/eladdekel.xcuserdatad/WorkspaceSettings.xcsettings rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app.xcodeproj/project.xcworkspace/xcuserdata/eladdekel.xcuserdatad/WorkspaceSettings.xcsettings diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/AppDelegate.swift b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/AppDelegate.swift similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/AppDelegate.swift rename to 
software/source/clients/mobile/ios/zeroone-app/zeroone-app/AppDelegate.swift diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/AccentColor.colorset/Contents.json b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/AccentColor.colorset/Contents.json similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/AccentColor.colorset/Contents.json rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/AccentColor.colorset/Contents.json diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/AppIcon.appiconset/Contents.json b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/AppIcon.appiconset/Contents.json similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/AppIcon.appiconset/Contents.json rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/AppIcon.appiconset/Contents.json diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/AppIcon.appiconset/O.png b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/AppIcon.appiconset/O.png similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/AppIcon.appiconset/O.png rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/AppIcon.appiconset/O.png diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/Contents.json b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/Contents.json similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/Contents.json rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/Contents.json diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/vector.imageset/Contents.json b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/vector.imageset/Contents.json similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/vector.imageset/Contents.json rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/vector.imageset/Contents.json diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/vector.imageset/vector.svg b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/vector.imageset/vector.svg similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Assets.xcassets/vector.imageset/vector.svg rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/Assets.xcassets/vector.imageset/vector.svg diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/AudioRecording.swift b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/AudioRecording.swift similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/AudioRecording.swift rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/AudioRecording.swift diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Base.lproj/LaunchScreen.storyboard b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Base.lproj/LaunchScreen.storyboard similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Base.lproj/LaunchScreen.storyboard rename to 
software/source/clients/mobile/ios/zeroone-app/zeroone-app/Base.lproj/LaunchScreen.storyboard diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Base.lproj/Main.storyboard b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Base.lproj/Main.storyboard similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Base.lproj/Main.storyboard rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/Base.lproj/Main.storyboard diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/Info.plist b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/Info.plist similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/Info.plist rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/Info.plist diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/SceneDelegate.swift b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/SceneDelegate.swift similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/SceneDelegate.swift rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/SceneDelegate.swift diff --git a/software/source/clients/ios/zeroone-app/zeroone-app/ViewController.swift b/software/source/clients/mobile/ios/zeroone-app/zeroone-app/ViewController.swift similarity index 100% rename from software/source/clients/ios/zeroone-app/zeroone-app/ViewController.swift rename to software/source/clients/mobile/ios/zeroone-app/zeroone-app/ViewController.swift diff --git a/software/source/clients/rpi/__init__.py b/software/source/clients/rpi/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/software/source/clients/rpi/device.py b/software/source/clients/rpi/device.py deleted file mode 100644 index fe0250b..0000000 --- a/software/source/clients/rpi/device.py +++ /dev/null @@ -1,11 +0,0 @@ -from ..base_device import Device - -device = Device() - - -def main(): - device.start() - - -if __name__ == "__main__": - main() diff --git a/software/source/clients/windows/__init__.py b/software/source/clients/windows/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/software/source/clients/windows/device.py b/software/source/clients/windows/device.py deleted file mode 100644 index 36182fb..0000000 --- a/software/source/clients/windows/device.py +++ /dev/null @@ -1,13 +0,0 @@ -from ..base_device import Device - -device = Device() - - -def main(server_url, debug): - device.server_url = server_url - device.debug = debug - device.start() - - -if __name__ == "__main__": - main() diff --git a/software/source/server/archive_async_interpreter.py b/software/source/server/archive_async_interpreter.py deleted file mode 100644 index 722781c..0000000 --- a/software/source/server/archive_async_interpreter.py +++ /dev/null @@ -1,252 +0,0 @@ -# This is a websocket interpreter, TTS and STT disabled. -# It makes a websocket on port 8000 that sends/recieves LMC messages in *streaming* format. - -### You MUST send a start and end flag with each message! 
For example: ### - -""" -{"role": "user", "type": "message", "start": True}) -{"role": "user", "type": "message", "content": "hi"}) -{"role": "user", "type": "message", "end": True}) -""" - -### -from pynput import keyboard -from .utils.bytes_to_wav import bytes_to_wav -from RealtimeTTS import TextToAudioStream, CoquiEngine, OpenAIEngine, ElevenlabsEngine -from RealtimeSTT import AudioToTextRecorder -import time -import asyncio -import json -import os - - -class AsyncInterpreter: - def __init__(self, interpreter, debug): - self.stt_latency = None - self.tts_latency = None - self.interpreter_latency = None - # time from first put to first yield - self.tffytfp = None - self.debug = debug - - self.interpreter = interpreter - self.audio_chunks = [] - - # STT - self.stt = AudioToTextRecorder( - model="tiny.en", spinner=False, use_microphone=False - ) - - self.stt.stop() # It needs this for some reason - - # TTS - if self.interpreter.tts == "coqui": - engine = CoquiEngine() - elif self.interpreter.tts == "openai": - engine = OpenAIEngine() - elif self.interpreter.tts == "elevenlabs": - engine = ElevenlabsEngine(api_key=os.environ["ELEVEN_LABS_API_KEY"]) - engine.set_voice("Michael") - else: - raise ValueError(f"Unsupported TTS engine: {self.interpreter.tts}") - self.tts = TextToAudioStream(engine) - - self.active_chat_messages = [] - - self._input_queue = asyncio.Queue() # Queue that .input will shove things into - self._output_queue = asyncio.Queue() # Queue to put output chunks into - self._last_lmc_start_flag = None # Unix time of last LMC start flag recieved - self._in_keyboard_write_block = ( - False # Tracks whether interpreter is trying to use the keyboard - ) - self.loop = asyncio.get_event_loop() - - async def _add_to_queue(self, queue, item): - await queue.put(item) - - async def clear_queue(self, queue): - while not queue.empty(): - await queue.get() - - async def clear_input_queue(self): - await self.clear_queue(self._input_queue) - - async def clear_output_queue(self): - await self.clear_queue(self._output_queue) - - async def input(self, chunk): - """ - Expects a chunk in streaming LMC format. - """ - if isinstance(chunk, bytes): - # It's probably a chunk of audio - self.stt.feed_audio(chunk) - self.audio_chunks.append(chunk) - # print("INTERPRETER FEEDING AUDIO") - - else: - - try: - chunk = json.loads(chunk) - except: - pass - - if "start" in chunk: - # print("Starting STT") - self.stt.start() - self._last_lmc_start_flag = time.time() - # self.interpreter.computer.terminal.stop() # Stop any code execution... maybe we should make interpreter.stop()? - elif "end" in chunk: - # print("Running OI on input") - asyncio.create_task(self.run()) - else: - await self._add_to_queue(self._input_queue, chunk) - - def add_to_output_queue_sync(self, chunk): - """ - Synchronous function to add a chunk to the output queue. 
- """ - # print("ADDING TO QUEUE:", chunk) - asyncio.create_task(self._add_to_queue(self._output_queue, chunk)) - - def generate(self, message, start_interpreter): - last_lmc_start_flag = self._last_lmc_start_flag - self.interpreter.messages = self.active_chat_messages - - # print("message is", message) - - for chunk in self.interpreter.chat(message, display=True, stream=True): - - if self._last_lmc_start_flag != last_lmc_start_flag: - # self.beeper.stop() - break - - # self.add_to_output_queue_sync(chunk) # To send text, not just audio - - content = chunk.get("content") - - # Handle message blocks - if chunk.get("type") == "message": - if content: - # self.beeper.stop() - - # Experimental: The AI voice sounds better with replacements like these, but it should happen at the TTS layer - # content = content.replace(". ", ". ... ").replace(", ", ", ... ").replace("!", "! ... ").replace("?", "? ... ") - # print("yielding ", content) - if self.tffytfp is None: - self.tffytfp = time.time() - - yield content - - # Handle code blocks - elif chunk.get("type") == "code": - if "start" in chunk: - # self.beeper.start() - pass - - # Experimental: If the AI wants to type, we should type immediatly - if ( - self.interpreter.messages[-1] - .get("content", "") - .startswith("computer.keyboard.write(") - ): - keyboard.controller.type(content) - self._in_keyboard_write_block = True - if "end" in chunk and self._in_keyboard_write_block: - self._in_keyboard_write_block = False - # (This will make it so it doesn't type twice when the block executes) - if self.interpreter.messages[-1]["content"].startswith( - "computer.keyboard.write(" - ): - self.interpreter.messages[-1]["content"] = ( - "dummy_variable = (" - + self.interpreter.messages[-1]["content"][ - len("computer.keyboard.write(") : - ] - ) - - # Send a completion signal - if self.debug: - end_interpreter = time.time() - self.interpreter_latency = end_interpreter - start_interpreter - print("INTERPRETER LATENCY", self.interpreter_latency) - # self.add_to_output_queue_sync({"role": "server","type": "completion", "content": "DONE"}) - - async def run(self): - """ - Runs OI on the audio bytes submitted to the input. Will add streaming LMC chunks to the _output_queue. 
- """ - self.interpreter.messages = self.active_chat_messages - - - self.stt.stop() - - input_queue = [] - while not self._input_queue.empty(): - input_queue.append(self._input_queue.get()) - - if self.debug: - start_stt = time.time() - message = self.stt.text() - end_stt = time.time() - self.stt_latency = end_stt - start_stt - print("STT LATENCY", self.stt_latency) - - if self.audio_chunks: - audio_bytes = bytearray(b"".join(self.audio_chunks)) - wav_file_path = bytes_to_wav(audio_bytes, "audio/raw") - print("wav_file_path ", wav_file_path) - self.audio_chunks = [] - else: - message = self.stt.text() - - print(message) - - # Feed generate to RealtimeTTS - self.add_to_output_queue_sync( - {"role": "assistant", "type": "audio", "format": "bytes.wav", "start": True} - ) - start_interpreter = time.time() - text_iterator = self.generate(message, start_interpreter) - - self.tts.feed(text_iterator) - if not self.tts.is_playing(): - self.tts.play_async(on_audio_chunk=self.on_tts_chunk, muted=True) - - while True: - await asyncio.sleep(0.1) - # print("is_playing", self.tts.is_playing()) - if not self.tts.is_playing(): - self.add_to_output_queue_sync( - { - "role": "assistant", - "type": "audio", - "format": "bytes.wav", - "end": True, - } - ) - if self.debug: - end_tts = time.time() - self.tts_latency = end_tts - self.tts.stream_start_time - print("TTS LATENCY", self.tts_latency) - self.tts.stop() - - break - - async def _on_tts_chunk_async(self, chunk): - # print("adding chunk to queue") - if self.debug and self.tffytfp is not None and self.tffytfp != 0: - print( - "time from first yield to first put is ", - time.time() - self.tffytfp, - ) - self.tffytfp = 0 - await self._add_to_queue(self._output_queue, chunk) - - def on_tts_chunk(self, chunk): - # print("ye") - asyncio.run(self._on_tts_chunk_async(chunk)) - - async def output(self): - # print("outputting chunks") - return await self._output_queue.get() diff --git a/software/source/server/archive_async_server.py b/software/source/server/archive_async_server.py deleted file mode 100644 index 7a72737..0000000 --- a/software/source/server/archive_async_server.py +++ /dev/null @@ -1,124 +0,0 @@ -import asyncio -import traceback -import json -from fastapi import FastAPI, WebSocket, Depends -from fastapi.responses import PlainTextResponse -from uvicorn import Config, Server -from .async_interpreter import AsyncInterpreter -from fastapi.middleware.cors import CORSMiddleware -from typing import List, Dict, Any -import os -import importlib.util - - - -os.environ["STT_RUNNER"] = "server" -os.environ["TTS_RUNNER"] = "server" - -app = FastAPI() - -app.add_middleware( - CORSMiddleware, - allow_origins=["*"], - allow_credentials=True, - allow_methods=["*"], # Allow all methods (GET, POST, etc.) 
- allow_headers=["*"], # Allow all headers -) - - -async def get_debug_flag(): - return app.state.debug - - -@app.get("/ping") -async def ping(): - return PlainTextResponse("pong") - - -@app.websocket("/") -async def websocket_endpoint( - websocket: WebSocket, debug: bool = Depends(get_debug_flag) -): - await websocket.accept() - - global global_interpreter - interpreter = global_interpreter - - # Send the tts_service value to the client - await websocket.send_text( - json.dumps({"type": "config", "tts_service": interpreter.interpreter.tts}) - ) - - try: - - async def receive_input(): - while True: - if websocket.client_state == "DISCONNECTED": - break - - data = await websocket.receive() - - await asyncio.sleep(0) - - if isinstance(data, bytes): - await interpreter.input(data) - elif "bytes" in data: - await interpreter.input(data["bytes"]) - # print("RECEIVED INPUT", data) - elif "text" in data: - # print("RECEIVED INPUT", data) - await interpreter.input(data["text"]) - - async def send_output(): - while True: - output = await interpreter.output() - - await asyncio.sleep(0) - - if isinstance(output, bytes): - # print(f"Sending {len(output)} bytes of audio data.") - await websocket.send_bytes(output) - - elif isinstance(output, dict): - # print("sending text") - await websocket.send_text(json.dumps(output)) - - await asyncio.gather(send_output(), receive_input()) - except Exception as e: - print(f"WebSocket connection closed with exception: {e}") - traceback.print_exc() - finally: - if not websocket.client_state == "DISCONNECTED": - await websocket.close() - - -async def main(server_host, server_port, profile, debug): - - app.state.debug = debug - - # Load the profile module from the provided path - spec = importlib.util.spec_from_file_location("profile", profile) - profile_module = importlib.util.module_from_spec(spec) - spec.loader.exec_module(profile_module) - - # Get the interpreter from the profile - interpreter = profile_module.interpreter - - if not hasattr(interpreter, 'tts'): - print("Setting TTS provider to default: openai") - interpreter.tts = "openai" - - # Make it async - interpreter = AsyncInterpreter(interpreter, debug) - - global global_interpreter - global_interpreter = interpreter - - print(f"Starting server on {server_host}:{server_port}") - config = Config(app, host=server_host, port=server_port, lifespan="on") - server = Server(config) - await server.serve() - - -if __name__ == "__main__": - asyncio.run(main()) diff --git a/software/source/server/conftest.py b/software/source/server/conftest.py deleted file mode 100644 index badf160..0000000 --- a/software/source/server/conftest.py +++ /dev/null @@ -1,36 +0,0 @@ -# tests currently hang after completion - -""" -import pytest -import signal -import os -from .profiles.default import interpreter -from async_interpreter import AsyncInterpreter -from fastapi.testclient import TestClient -from .async_server import app - - -@pytest.fixture -def client(): - return TestClient(app) - - -@pytest.fixture -def mock_interpreter(): - async_interpreter = AsyncInterpreter(interpreter) - yield async_interpreter - async_interpreter.shutdown() - - -@pytest.fixture(scope="function", autouse=True) -def term_handler(): - - orig = signal.signal(signal.SIGTERM, signal.getsignal(signal.SIGINT)) - yield - signal.signal(signal.SIGTERM, orig) - - - yield - # Send SIGTERM signal to the current process and its children - os.kill(os.getpid(), signal.SIGTERM) -""" diff --git a/software/source/server/conversations/another-interpreter.json 
b/software/source/server/conversations/another-interpreter.json deleted file mode 100644 index e69de29..0000000 diff --git a/software/worker.py b/software/source/server/livekit/worker.py similarity index 100% rename from software/worker.py rename to software/source/server/livekit/worker.py diff --git a/software/source/server/async_server.py b/software/source/server/server.py similarity index 82% rename from software/source/server/async_server.py rename to software/source/server/server.py index 1ddc832..ed3bec0 100644 --- a/software/source/server/async_server.py +++ b/software/source/server/server.py @@ -1,15 +1,17 @@ -from RealtimeTTS import TextToAudioStream, CoquiEngine, OpenAIEngine, ElevenlabsEngine from fastapi.responses import PlainTextResponse from RealtimeSTT import AudioToTextRecorder +from RealtimeTTS import TextToAudioStream import importlib +import warnings import asyncio import types import wave import os +import sys os.environ["INTERPRETER_REQUIRE_ACKNOWLEDGE"] = "False" -def start_server(server_host, server_port, profile, debug): +def start_server(server_host, server_port, profile, voice, debug): # Load the profile module from the provided path spec = importlib.util.spec_from_file_location("profile", profile) @@ -19,6 +21,18 @@ def start_server(server_host, server_port, profile, debug): # Get the interpreter from the profile interpreter = profile_module.interpreter + # Apply our settings to it + interpreter.verbose = debug + interpreter.server.host = server_host + interpreter.server.port = server_port + + if voice == False: + # If voice is False, just start the standard OI server + interpreter.server.run() + exit() + + # ONLY if voice is True, will we run the rest of this file. + # STT interpreter.stt = AudioToTextRecorder( model="tiny.en", spinner=False, use_microphone=False @@ -29,21 +43,30 @@ def start_server(server_host, server_port, profile, debug): if not hasattr(interpreter, 'tts'): print("Setting TTS provider to default: openai") interpreter.tts = "openai" + if interpreter.tts == "coqui": + from RealtimeTTS import CoquiEngine engine = CoquiEngine() elif interpreter.tts == "openai": - engine = OpenAIEngine(voice="onyx") + from RealtimeTTS import OpenAIEngine + if hasattr(interpreter, 'voice'): + voice = interpreter.voice + else: + voice = "onyx" + engine = OpenAIEngine(voice=voice) elif interpreter.tts == "elevenlabs": - engine = ElevenlabsEngine(api_key=os.environ["ELEVEN_LABS_API_KEY"]) - engine.set_voice("Will") + from RealtimeTTS import ElevenlabsEngine + engine = ElevenlabsEngine() + if hasattr(interpreter, 'voice'): + voice = interpreter.voice + else: + voice = "Will" + engine.set_voice(voice) else: raise ValueError(f"Unsupported TTS engine: {interpreter.tts}") interpreter.tts = TextToAudioStream(engine) # Misc Settings - interpreter.verbose = debug - interpreter.server.host = server_host - interpreter.server.port = server_port interpreter.play_audio = False interpreter.audio_chunks = [] @@ -66,7 +89,10 @@ def start_server(server_host, server_port, profile, debug): self.stt.stop() content = self.stt.text() - print("\n\nUser: ", content) + if content.strip() == "": + return + + print(">", content.strip()) if False: audio_bytes = bytearray(b"".join(self.audio_chunks)) @@ -127,6 +153,6 @@ def start_server(server_host, server_port, profile, debug): return PlainTextResponse("pong") # Start server + interpreter.server.display = True interpreter.print = True - interpreter.debug = False interpreter.server.run() \ No newline at end of file diff --git 
a/software/source/server/skills/__init__.py b/software/source/server/skills/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/software/source/server/system_messages/BaseSystemMessage.py b/software/source/server/system_messages/BaseSystemMessage.py deleted file mode 100644 index 00070a9..0000000 --- a/software/source/server/system_messages/BaseSystemMessage.py +++ /dev/null @@ -1,242 +0,0 @@ -# The dynamic system message is where most of the 01's behavior is configured. -# You can put code into the system message {{ in brackets like this }} -# which will be rendered just before the interpreter starts writing a message. - -import os - -system_message = r""" - -You are the 01, a SCREENLESS executive assistant that can complete any task. -When you execute code, it will be executed on the user's machine. The user has given you full and complete permission to execute any code necessary to complete the task. Execute the code. -You can access the internet. Run any code to achieve the goal, and if at first you don't succeed, try again and again. -You can install new packages. -Be concise. Your messages are being read aloud to the user. DO NOT MAKE PLANS. RUN CODE QUICKLY. -Try to spread complex tasks over multiple code blocks. Don't try to complex tasks in one go. -Manually summarize text. - -Use computer.browser.search for almost everything. Use Applescript frequently. - -The user is in Seattle, Washington. - -To send email, use Applescript. To check calendar events, use iCal buddy (e.g. `/opt/homebrew/bin/icalBuddy eventsFrom:today to:+7`) - -DONT TELL THE USER THE METHOD YOU'LL USE. Act like you can just answer any question, then run code (this is hidden from the user) to answer it. - -Your responses should be very short, no more than 1-2 sentences long. - -DO NOT USE MARKDOWN. ONLY WRITE PLAIN TEXT. DO NOT USE MARKDOWN. - -# TASKS - -You should help the user manage their tasks. - -Store the user's tasks in a Python list called `tasks`. - ---- - -The user's current task is: {{ tasks[0] if tasks else "No current tasks." }} - -{{ -if len(tasks) > 1: -print("The next task is: ", tasks[1]) -}} - ---- - -When the user completes the current task, you should remove it from the list and read the next item by running `tasks = tasks[1:]\ntasks[0]`. Then, tell the user what the next task is. - -When the user tells you about a set of tasks, you should intelligently order tasks, batch similar tasks, and break down large tasks into smaller tasks (for this, you should consult the user and get their permission to break it down). Your goal is to manage the task list as intelligently as possible, to make the user as efficient and non-overwhelmed as possible. They will require a lot of encouragement, support, and kindness. Don't say too much about what's ahead of them— just try to focus them on each step at a time. - -After starting a task, you should check in with the user around the estimated completion time to see if the task is completed. - -To do this, schedule a reminder based on estimated completion time using the function `schedule(days=0, hours=0, mins=0, secs=0, datetime="valid date time", message="Your message here.")`, WHICH HAS ALREADY BEEN IMPORTED. YOU DON'T NEED TO IMPORT THE `schedule` FUNCTION. IT IS AVAILABLE. You'll receive the message at the time you scheduled it. - -You guide the user through the list one task at a time, convincing them to move forward, giving a pep talk if need be. Your job is essentially to answer "what should I (the user) be doing right now?" 
for every moment of the day. - -# BROWSER - -The Google search result will be returned from this function as a string: `computer.browser.search("query")` - -# CRITICAL NOTES - -Code output, despite being sent to you by the user, cannot be seen by the user. You NEED to tell the user about the output of some code, even if it's exact. >>The user does not have a screen.<< - -ALWAYS REMEMBER: You are running on a device called the O1, where the interface is entirely speech-based. Make your responses to the user VERY short. DO NOT PLAN. BE CONCISE. WRITE CODE TO RUN IT. - -Translate things to other languages INSTANTLY and MANUALLY. Don't try to use a translation tool. Summarize things manually. Don't use a summarizer tool. - -""" - -# OLD SYSTEM MESSAGE - -old_system_message = r""" - -You are the 01, an executive assistant that can complete **any** task. -When you execute code, it will be executed **on the user's machine**. The user has given you **full and complete permission** to execute any code necessary to complete the task. Execute the code. -You can access the internet. Run **any code** to achieve the goal, and if at first you don't succeed, try again and again. -You can install new packages. -Be concise. Your messages are being read aloud to the user. DO NOT MAKE PLANS. Immediately run code. -Try to spread complex tasks over multiple code blocks. -Manually summarize text. You cannot use other libraries to do this. You MUST MANUALLY SUMMARIZE, WITHOUT CODING. - -For the users request, first, choose if you want to use Python, Applescript, Shell, or computer control (below) via Python. - -# USER'S TASKS - -You should help the user manage their tasks. - -Store the user's tasks in a Python list called `tasks`. - ---- - -The user's current task is: {{ tasks[0] if tasks else "No current tasks." }} - -{{ -if len(tasks) > 1: -print("The next task is: ", tasks[1]) -}} - ---- - -When the user completes the current task, you should remove it from the list and read the next item by running `tasks = tasks[1:]\ntasks[0]`. Then, tell the user what the next task is. - -When the user tells you about a set of tasks, you should intelligently order tasks, batch similar tasks, and break down large tasks into smaller tasks (for this, you should consult the user and get their permission to break it down). Your goal is to manage the task list as intelligently as possible, to make the user as efficient and non-overwhelmed as possible. They will require a lot of encouragement, support, and kindness. Don't say too much about what's ahead of them— just try to focus them on each step at a time. - -After starting a task, you should check in with the user around the estimated completion time to see if the task is completed. Use the `schedule(datetime, message)` function, which has already been imported. - -To do this, schedule a reminder based on estimated completion time using the function `schedule(datetime_object, "Your message here.")`, WHICH HAS ALREADY BEEN IMPORTED. YOU DON'T NEED TO IMPORT THE `schedule` FUNCTION. IT IS AVALIABLE. You'll receive the message at `datetime_object`. - -You guide the user through the list one task at a time, convincing them to move forward, giving a pep talk if need be. Your job is essentially to answer "what should I (the user) be doing right now?" for every moment of the day. - -# COMPUTER CONTROL (RARE) - -You are a computer controlling language model. You can 100% control the user's GUI. 
- -You may use the `computer` Python module (already imported) to control the user's keyboard and mouse, if the task **requires** it: - -```python -computer.browser.search(query) - -computer.display.view() # Shows you what's on the screen, returns a `pil_image` `in case you need it (rarely). **You almost always want to do this first!** - -computer.keyboard.hotkey(" ", "command") # Opens spotlight -computer.keyboard.write("hello") - -computer.mouse.click("text onscreen") # This clicks on the UI element with that text. Use this **frequently** and get creative! To click a video, you could pass the *timestamp* (which is usually written on the thumbnail) into this. -computer.mouse.move("open recent >") # This moves the mouse over the UI element with that text. Many dropdowns will disappear if you click them. You have to hover over items to reveal more. -computer.mouse.click(x=500, y=500) # Use this very, very rarely. It's highly inaccurate -computer.mouse.click(icon="gear icon") # Moves mouse to the icon with that description. Use this very often - -computer.mouse.scroll(-10) # Scrolls down. If you don't find some text on screen that you expected to be there, you probably want to do this -x, y = computer.display.center() # Get your bearings - -computer.clipboard.view() # Returns contents of clipboard -computer.os.get_selected_text() # Use frequently. If editing text, the user often wants this -``` - -You are an image-based AI, you can see images. -Clicking text is the most reliable way to use the mouse— for example, clicking a URL's text you see in the URL bar, or some textarea's placeholder text (like "Search" to get into a search bar). -If you use `plt.show()`, the resulting image will be sent to you. However, if you use `PIL.Image.show()`, the resulting image will NOT be sent to you. -It is very important to make sure you are focused on the right application and window. Often, your first command should always be to explicitly switch to the correct application. -When searching the web, use query parameters. For example, https://www.amazon.com/s?k=monitor -Try multiple methods before saying the task is impossible. **You can do it!** - -{{ -# Add window information - -import sys -import os -import json - -original_stdout = sys.stdout -sys.stdout = open(os.devnull, 'w') -original_stderr = sys.stderr -sys.stderr = open(os.devnull, 'w') - -try: - - import pywinctl - - active_window = pywinctl.getActiveWindow() - - if active_window: - app_info = "" - - if "_appName" in active_window.__dict__: - app_info += ( - "Active Application: " + active_window.__dict__["_appName"] - ) - - if hasattr(active_window, "title"): - app_info += "\n" + "Active Window Title: " + active_window.title - elif "_winTitle" in active_window.__dict__: - app_info += ( - "\n" - + "Active Window Title:" - + active_window.__dict__["_winTitle"] - ) - - if app_info != "": - print(app_info) -except: - # Non blocking - pass -finally: - sys.stdout = original_stdout - sys.stderr = original_stderr - -}} - -# SKILLS - -Try to use the following functions (assume they're imported) to complete your goals whenever possible: - -{{ -import sys -import os -import json - -from interpreter import interpreter -from pathlib import Path - -interpreter.model = "gpt-3.5" - -combined_messages = "\\n".join(json.dumps(x) for x in messages[-3:]) -#query_msg = interpreter.chat(f"This is the conversation so far: {combined_messages}. 
What is a <10 words query that could be used to find functions that would help answer the user's question?") -#query = query_msg[0]['content'] -query = combined_messages -interpreter.computer.skills.path = '''OI_SKILLS_DIR''' - -skills = interpreter.computer.skills.search(query) -lowercase_skills = [skill[0].lower() + skill[1:] for skill in skills] -output = "\\n".join(lowercase_skills) - -# VERY HACKY! We should fix this, we hard code it for noisy code^: -print("IGNORE_ALL_ABOVE_THIS_LINE") - -print(output) -}} - -Remember: You can run Python code outside a function only to run a Python function; all other code must go in a in Python function if you first write a Python function. ALL imports must go inside the function. - -# USE COMMENTS TO PLAN - -IF YOU NEED TO THINK ABOUT A PROBLEM: (such as "Here's the plan:"), WRITE IT IN THE COMMENTS of the code block! - -For example: -> User: What is 432/7? -> Assistant: Let me use Python to calculate that. -> Assistant Python function call: -> # Here's the plan: -> # 1. Divide the numbers -> # 2. Round it to 3 digits. -> print(round(432/7, 3)) -> Assistant: 432 / 7 is 61.714. - -# FINAL MESSAGES - -ALWAYS REMEMBER: You are running on a device called the O1, where the interface is entirely speech-based. Make your responses to the user **VERY short.** - -""".strip().replace( - "OI_SKILLS_DIR", os.path.join(os.path.dirname(__file__), "skills") -) diff --git a/software/source/server/system_messages/TeachModeSystemMessage.py b/software/source/server/system_messages/TeachModeSystemMessage.py deleted file mode 100644 index 4c8ec09..0000000 --- a/software/source/server/system_messages/TeachModeSystemMessage.py +++ /dev/null @@ -1,136 +0,0 @@ -# The dynamic system message is where most of the 01's behavior is configured. -# You can put code into the system message {{ in brackets like this }} -# which will be rendered just before the interpreter starts writing a message. - -import os - -system_message = r""" - -You are the 01, an executive assistant that can complete **any** task. -When you execute code, it will be executed **on the user's machine**. The user has given you **full and complete permission** to execute any code necessary to complete the task. Execute the code. -For the users request, ALWAYS CHOOSE PYTHON. If the task requires computer control, USE THE computer control (mentioned below) or the Skills library (also mentioned below) via Python. -Try to execute the user's request with the computer control or the Skills library first. ONLY IF the task cannot be completed using the computer control or the skills library, write your own code. -If you're writing your own code, YOU CAN ACCESS THE INTERNET. Run **any code** to achieve the goal, and if at first you don't succeed, try again and again. -You can install new packages. -Be concise. DO NOT MAKE PLANS. Immediately run code. -Try to spread complex tasks over multiple code blocks. -Manually summarize text. You cannot use other libraries to do this. You MUST MANUALLY SUMMARIZE, WITHOUT CODING. - -When a user refers to a filename, they're likely referring to an existing file in the directory you're currently executing code in. - -# COMPUTER CONTROL - -You are a computer controlling language model. You can 100% control the user's GUI. 
- -You may use the `computer` Python module to control the user's keyboard and mouse, if the task **requires** it: - -```python -from interpreter import interpreter -import os -import time - -interpreter.computer.browser.search(query) - -interpreter.computer.display.view() # Shows you what's on the screen, returns a `pil_image` `in case you need it (rarely). **You almost always want to do this first!** - -interpreter.computer.keyboard.hotkey(" ", "command") # Opens spotlight -interpreter.computer.keyboard.write("hello") - -interpreter.computer.mouse.click("text onscreen") # This clicks on the UI element with that text. Use this **frequently** and get creative! To click a video, you could pass the *timestamp* (which is usually written on the thumbnail) into this. -interpreter.computer.mouse.move("open recent >") # This moves the mouse over the UI element with that text. Many dropdowns will disappear if you click them. You have to hover over items to reveal more. -interpreter.computer.mouse.click(x=500, y=500) # Use this very, very rarely. It's highly inaccurate -interpreter.computer.mouse.click(icon="gear icon") # Moves mouse to the icon with that description. Use this very often - -interpreter.computer.mouse.scroll(-10) # Scrolls down. If you don't find some text on screen that you expected to be there, you probably want to do this -x, y = interpreter.computer.display.center() # Get your bearings - -interpreter.computer.clipboard.view() # Returns contents of clipboard -interpreter.computer.os.get_selected_text() # Use frequently. If editing text, the user often wants this -``` - -You are an image-based AI, you can see images. -Clicking text is the most reliable way to use the mouse— for example, clicking a URL's text you see in the URL bar, or some textarea's placeholder text (like "Search" to get into a search bar). -If you use `plt.show()`, the resulting image will be sent to you. However, if you use `PIL.Image.show()`, the resulting image will NOT be sent to you. -It is very important to make sure you are focused on the right application and window. Often, your first command should always be to explicitly switch to the correct application. -When searching the web, use query parameters. For example, https://www.amazon.com/s?k=monitor -Try multiple methods before saying the task is impossible. **You can do it!** - -{{ - -import sys -import os -import json - -original_stdout = sys.stdout -sys.stdout = open(os.devnull, 'w') -original_stderr = sys.stderr -sys.stderr = open(os.devnull, 'w') - -try: - - import pywinctl - - active_window = pywinctl.getActiveWindow() - - if active_window: - app_info = "" - - if "_appName" in active_window.__dict__: - app_info += ( - "Active Application: " + active_window.__dict__["_appName"] - ) - - if hasattr(active_window, "title"): - app_info += "\n" + "Active Window Title: " + active_window.title - elif "_winTitle" in active_window.__dict__: - app_info += ( - "\n" - + "Active Window Title:" - + active_window.__dict__["_winTitle"] - ) - - if app_info != "": - print(app_info) -except: - pass -finally: - sys.stdout = original_stdout - sys.stderr = original_stderr - -}} - -# SKILLS LIBRARY - -This is the skills library. 
Try to use the following functions to complete your goals WHENEVER POSSIBLE: - -{{ -import sys -import os -import json - -from interpreter import interpreter -from pathlib import Path - -interpreter.model = "gpt-3.5" - -combined_messages = "\\n".join(json.dumps(x) for x in messages[-3:]) -#query_msg = interpreter.chat(f"This is the conversation so far: {combined_messages}. What is a <10 words query that could be used to find functions that would help answer the user's question?") -#query = query_msg[0]['content'] -query = combined_messages -interpreter.computer.skills.path = '''OI_SKILLS_DIR''' - -skills = interpreter.computer.skills.search(query) -lowercase_skills = [skill[0].lower() + skill[1:] for skill in skills] -output = "\\n".join(lowercase_skills) - -# VERY HACKY! We should fix this, we hard code it for noisy code^: -#print("IGNORE_ALL_ABOVE_THIS_LINE") - -print(output) -}} - -Remember: You can run Python code outside a function only to run a Python function; all other code must go in a in Python function if you first write a Python function. ALL imports must go inside the function. - -""".strip().replace( - "OI_SKILLS_DIR", os.path.abspath(os.path.join(os.path.dirname(__file__), "skills")) -) diff --git a/software/source/server/system_messages/__init__.py b/software/source/server/system_messages/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/software/source/server/tests/test_run.py b/software/source/server/tests/test_run.py index 3ca7565..f3fd1b3 100644 --- a/software/source/server/tests/test_run.py +++ b/software/source/server/tests/test_run.py @@ -11,15 +11,13 @@ def test_poetry_run_01(): while True: output = process.stdout.readline().decode('utf-8') - if "Hold spacebar to record." in output: + if "Hold" in output: assert True return if time.time() > timeout: assert False, "Timeout reached without finding expected output." return - - # @pytest.mark.skip(reason="pytest hanging") # def test_ping(client): # response = client.get("/ping") diff --git a/software/source/server/tunnel.py b/software/source/server/tunnel.py deleted file mode 100644 index 4091eb0..0000000 --- a/software/source/server/tunnel.py +++ /dev/null @@ -1,30 +0,0 @@ -import ngrok -import pyqrcode -from ..utils.print_markdown import print_markdown - -def create_tunnel( - server_host="localhost", server_port=10101, qr=False, domain=None -): - """ - To use most of ngrok’s features, you’ll need an authtoken. To obtain one, sign up for free at ngrok.com and - retrieve it from the authtoken page in your ngrok dashboard. 
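For reference, a hedged sketch of how this tunnel helper can be called, assuming an ngrok authtoken is already available. The defaults mirror the signature above; the import path and token value are placeholders, not something the source confirms.

```python
# Hypothetical usage of create_tunnel; the import path reflects where the file
# lived (software/source/server/tunnel.py) and may need adjusting, and the
# authtoken value is a placeholder.
import os

os.environ.setdefault("NGROK_AUTHTOKEN", "<your-ngrok-authtoken>")

from source.server.tunnel import create_tunnel

url = create_tunnel(server_host="localhost", server_port=10101, qr=True)
print(f"Ingress established at: {url}")
```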
- - https://dashboard.ngrok.com/get-started/your-authtoken - - You can set it as `NGROK_AUTHTOKEN` in your environment variables - """ - print_markdown("Exposing server to the internet...") - - if domain: - listener = ngrok.forward(f"{server_host}:{server_port}", authtoken_from_env=True, domain=domain) - else: - listener = ngrok.forward(f"{server_host}:{server_port}", authtoken_from_env=True) - - listener_url = listener.url() - - print(f"Ingress established at: {listener_url}"); - if listener_url and qr: - text = pyqrcode.create(listener_url) - print(text.terminal(quiet_zone=1)) - - return listener_url diff --git a/software/source/server/utils/bytes_to_wav.py b/software/source/server/utils/bytes_to_wav.py deleted file mode 100644 index 286ae4d..0000000 --- a/software/source/server/utils/bytes_to_wav.py +++ /dev/null @@ -1,67 +0,0 @@ -from datetime import datetime -import os -import contextlib -import tempfile -import ffmpeg -import subprocess - - -def convert_mime_type_to_format(mime_type: str) -> str: - if mime_type == "audio/x-wav" or mime_type == "audio/wav": - return "wav" - if mime_type == "audio/webm": - return "webm" - if mime_type == "audio/raw": - return "dat" - - return mime_type - - -@contextlib.contextmanager -def export_audio_to_wav_ffmpeg(audio: bytearray, mime_type: str) -> str: - temp_dir = tempfile.gettempdir() - - # Create a temporary file with the appropriate extension - input_ext = convert_mime_type_to_format(mime_type) - input_path = os.path.join( - temp_dir, f"input_{datetime.now().strftime('%Y%m%d%H%M%S%f')}.{input_ext}" - ) - with open(input_path, "wb") as f: - f.write(audio) - - # Check if the input file exists - assert os.path.exists(input_path), f"Input file does not exist: {input_path}" - - # Export to wav - output_path = os.path.join( - temp_dir, f"output_{datetime.now().strftime('%Y%m%d%H%M%S%f')}.wav" - ) - # print(mime_type, input_path, output_path) - if mime_type == "audio/raw": - ffmpeg.input( - input_path, - f="s16le", - ar="16000", - ac=1, - ).output(output_path, loglevel="panic").run() - else: - ffmpeg.input(input_path).output( - output_path, acodec="pcm_s16le", ac=1, ar="16k", loglevel="panic" - ).run() - - try: - yield output_path - finally: - os.remove(input_path) - - -def run_command(command): - result = subprocess.run( - command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True, check=True - ) - return result.stdout, result.stderr - - -def bytes_to_wav(audio_bytes: bytearray, mime_type): - with export_audio_to_wav_ffmpeg(audio_bytes, mime_type) as wav_file_path: - return wav_file_path diff --git a/software/source/server/utils/logs.py b/software/source/server/utils/logs.py deleted file mode 100644 index 7b071a6..0000000 --- a/software/source/server/utils/logs.py +++ /dev/null @@ -1,24 +0,0 @@ -from dotenv import load_dotenv - -load_dotenv() # take environment variables from .env. 
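The audio helper above can be exercised roughly as follows. This is a sketch only: the input file is a placeholder, the import path is a guess at the pre-removal layout, and it assumes `ffmpeg` is on the PATH along with the `ffmpeg-python` package the module imports.

```python
# Hypothetical driver for bytes_to_wav; 'sample.webm' is a placeholder and the
# import path (software/source/server/utils/bytes_to_wav.py before removal)
# may need adjusting for your environment.
from source.server.utils.bytes_to_wav import bytes_to_wav

with open("sample.webm", "rb") as f:
    audio_bytes = bytearray(f.read())

wav_path = bytes_to_wav(audio_bytes, "audio/webm")
print("Converted WAV written to:", wav_path)
```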
- -import os -import logging - -logger: logging.Logger = logging.getLogger("01") -root_logger: logging.Logger = logging.getLogger() - - -def _basic_config() -> None: - logging.basicConfig(format="%(message)s") - - -def setup_logging() -> None: - env = os.environ.get("LOG_LEVEL", "").upper() - if env == "DEBUG": - _basic_config() - logger.setLevel(logging.DEBUG) - root_logger.setLevel(logging.DEBUG) - elif env == "INFO": - _basic_config() - logger.setLevel(logging.INFO) diff --git a/software/source/server/utils/process_utils.py b/software/source/server/utils/process_utils.py deleted file mode 100644 index 5337bae..0000000 --- a/software/source/server/utils/process_utils.py +++ /dev/null @@ -1,33 +0,0 @@ -import os -import psutil -import signal - - -def kill_process_tree(): - pid = os.getpid() # Get the current process ID - try: - # Send SIGTERM to the entire process group to ensure all processes are targeted - try: - os.killpg(os.getpgid(pid), signal.SIGKILL) - # Windows implementation - except AttributeError: - os.kill(pid, signal.SIGTERM) - parent = psutil.Process(pid) - children = parent.children(recursive=True) - for child in children: - print(f"Forcefully terminating child PID {child.pid}") - child.kill() # Forcefully kill the child process immediately - gone, still_alive = psutil.wait_procs(children, timeout=3) - - if still_alive: - for child in still_alive: - print(f"Child PID {child.pid} still alive, attempting another kill") - child.kill() - - print(f"Forcefully terminating parent PID {pid}") - parent.kill() # Forcefully kill the parent process immediately - parent.wait(3) # Wait for the parent process to terminate - except psutil.NoSuchProcess: - print(f"Process {pid} does not exist or is already terminated") - except psutil.AccessDenied: - print("Permission denied to terminate some processes") diff --git a/software/source/utils/accumulator.py b/software/source/utils/accumulator.py deleted file mode 100644 index d4715e1..0000000 --- a/software/source/utils/accumulator.py +++ /dev/null @@ -1,93 +0,0 @@ -class Accumulator: - def __init__(self): - self.template = {"role": None, "type": None, "format": None, "content": None} - self.message = self.template - - def accumulate(self, chunk): - # print(str(chunk)[:100]) - if type(chunk) == dict: - if "format" in chunk and chunk["format"] == "active_line": - # We don't do anything with these - return None - - if "start" in chunk: - self.message = chunk - self.message.pop("start") - return None - - if "content" in chunk: - if any( - self.message[key] != chunk[key] - for key in self.message - if key != "content" - ): - self.message = chunk - if "content" not in self.message: - self.message["content"] = chunk["content"] - else: - if type(chunk["content"]) == dict: - # dict concatenation cannot happen, so we see if chunk is a dict - self.message["content"]["content"] += chunk["content"][ - "content" - ] - else: - self.message["content"] += chunk["content"] - return None - - if "end" in chunk: - # We will proceed - message = self.message - self.message = self.template - return message - - if type(chunk) == bytes: - if "content" not in self.message or type(self.message["content"]) != bytes: - self.message["content"] = b"" - self.message["content"] += chunk - return None - - def accumulate_mobile(self, chunk): - # print(str(chunk)[:100]) - if type(chunk) == dict: - if "format" in chunk and chunk["format"] == "active_line": - # We don't do anything with these - return None - - if "start" in chunk: - self.message = chunk - self.message.pop("start") 
- return None - - if "content" in chunk: - if any( - self.message[key] != chunk[key] - for key in self.message - if key != "content" - ): - self.message = chunk - if "content" not in self.message: - self.message["content"] = chunk["content"] - else: - if type(chunk["content"]) == dict: - # dict concatenation cannot happen, so we see if chunk is a dict - self.message["content"]["content"] += chunk["content"][ - "content" - ] - else: - self.message["content"] += chunk["content"] - return None - - if "end" in chunk: - # We will proceed - message = self.message - self.message = self.template - return message - - if type(chunk) == bytes: - if "content" not in self.message or type(self.message["content"]) != bytes: - self.message["content"] = b"" - self.message["content"] += chunk - - self.message["type"] = "audio" - self.message["format"] = "bytes.wav" - return self.message diff --git a/software/source/utils/print_markdown.py b/software/source/utils/print_markdown.py deleted file mode 100644 index f4eff47..0000000 --- a/software/source/utils/print_markdown.py +++ /dev/null @@ -1,10 +0,0 @@ -from rich.console import Console -from rich.markdown import Markdown - - -def print_markdown(markdown_text): - console = Console() - md = Markdown(markdown_text) - print("") - console.print(md) - print("") diff --git a/software/tests/__init__.py b/software/tests/__init__.py deleted file mode 100644 index e69de29..0000000
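Finally, a hedged sketch of how the removed `Accumulator` could be driven with streamed chunks. The chunk shapes are illustrative, inferred from the keys the class checks (`start`, `content`, `end`), and the import path is a guess at the pre-removal layout.

```python
# Hypothetical driver for the removed Accumulator; the chunk contents and the
# import path are assumptions for illustration.
from source.utils.accumulator import Accumulator

accumulator = Accumulator()

stream = [
    {"role": "assistant", "type": "message", "start": True},
    {"role": "assistant", "type": "message", "content": "Hello, "},
    {"role": "assistant", "type": "message", "content": "world."},
    {"role": "assistant", "type": "message", "end": True},
]

for chunk in stream:
    message = accumulator.accumulate(chunk)
    if message is not None:
        # -> {'role': 'assistant', 'type': 'message', 'content': 'Hello, world.'}
        print(message)
```

The `accumulate_mobile` variant buffers raw byte chunks the same way but additionally tags the assembled message as `bytes.wav` audio.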