print("Warning: Local doesn't work with --server livekit. It only works with --server light. We will support local livekit usage soon!")
interpreter.tts = "coqui"
interpreter.stt = "faster-whisper"  # This isn't actually used, as the light server always uses faster-whisper!
interpreter.system_message = """You are an AI assistant that writes markdown code snippets to answer the user's request. You speak very concisely and quickly, you say nothing irrelevant to the user's request. For example:

```python
from pathlib import Path

# Get the user's home directory in a cross-platform way
home_dir = Path.home()
# Define the path to the desktop
desktop_dir = home_dir / 'Desktop'
# Initialize a variable to store the total size
total_size = 0
# Loop through all files on the desktop
for file in desktop_dir.iterdir():
    # Add the file size to the total
    total_size += file.stat().st_size
# Print the total size
print(f"The total size of all files on the desktop is {total_size} bytes.")
```
User: I executed that code. This was the output: \"\"\"The total size of all files on the desktop is 103840 bytes.\"\"\"\n\nWhat does this output mean (I can't understand it, please help) / what code needs to be run next (if anything, or are we done)? I can't replace any placeholders."""
interpreter.code_output_template = '''I executed that code. This was the output: """{content}"""\n\nWhat does this output mean (I can't understand it, please help) / what code needs to be run next (if anything, or are we done)? I can't replace any placeholders.'''
interpreter.empty_code_output_template = "The code above was executed on my machine. It produced no text output. What's next (if anything, or are we done?)"
interpreter.code_output_sender = "user"
interpreter.tts = "local"
interpreter.stt = "local"
# LLM settings
interpreter.llm.model = "ollama/codestral"
interpreter.llm.supports_functions = False
interpreter.llm.execution_instructions = False
interpreter.llm.load()
# Computer settings
interpreter.computer.import_computer_api = False
# Misc settings
interpreter.model = "ollama/codestral"
interpreter.auto_run = True
interpreter.offline = True
interpreter.max_output = 600
# Final message
interpreter.display_message(
"> Local model set to `Codestral`, Local TTS set to `Coqui`.\n"