When the user tells you about a set of tasks, you should intelligently order tasks, batch similar tasks, and break down large tasks into smaller tasks (for this, you should consult the user and get their permission to break it down). Your goal is to manage the task list as intelligently as possible, to make the user as efficient and non-overwhelmed as possible. They will require a lot of encouragement, support, and kindness. Don't say too much about what's ahead of them — just try to focus them on one step at a time.
To do this, schedule a reminder based on the estimated completion time using the function `schedule(message="Your message here.", start="8am")`, WHICH HAS ALREADY BEEN IMPORTED. YOU DON'T NEED TO IMPORT THE `schedule` FUNCTION. IT IS AVAILABLE. You'll receive the message at the time you scheduled it. If the user asks you to monitor something, simply schedule it with an interval that makes sense for the problem, like this: `schedule(message="Your message here.", interval="5m")`
@@ -182,7 +182,6 @@ Try multiple methods before saying the task is impossible. **You can do it!**
)
lines = result.stdout.split("\n")
names = [
    line.split()[0].replace(":latest", "")
    for line in lines[1:]
    if line.strip()
]  # Extract names, trim out ":latest", skip header
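The name-extraction comprehension above can be sanity-checked in isolation against a hand-written sample of `ollama list` output. The model names and table rows below are illustrative stand-ins, not real command output:

```python
# Illustrative sample of `ollama list` stdout: a header row followed by one row per model.
sample_stdout = (
    "NAME            ID      SIZE    MODIFIED\n"
    "llama3:latest   abc123  4.7 GB  2 days ago\n"
    "mistral:latest  def456  4.1 GB  5 days ago\n"
)

lines = sample_stdout.split("\n")
names = [
    line.split()[0].replace(":latest", "")  # first column, with the ":latest" tag trimmed
    for line in lines[1:]  # skip the header row
    if line.strip()  # drop blank lines, including the trailing one
]
print(names)  # → ['llama3', 'mistral']
```

Note that `replace(":latest", "")` only strips the default tag; a model pulled with an explicit tag (e.g. `llama3:8b`) keeps its tag in the resulting name.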
# If there are no downloaded models, prompt them to download a model and try again
if not names:
    time.sleep(1)
    interpreter.display_message(
        "\nYou don't have any Ollama models downloaded. To download a new model, run `ollama run <model-name>`, then start a new 01 session. \n\n For a full list of downloadable models, check out [https://ollama.com/library](https://ollama.com/library) \n"
    )
    print("Please download a model then try again\n")
    time.sleep(2)
    sys.exit(1)
# If there are models, prompt them to select one
else:
    time.sleep(1)
    interpreter.display_message(
        f"**{len(names)} Ollama model{'s' if len(names) != 1 else ''} found.** To download a new model, run `ollama run <model-name>`, then start a new 01 session. \n\n For a full list of downloadable models, check out [https://ollama.com/library](https://ollama.com/library) \n"
    )
    # Create a new inquirer selection from the names
    name_question = [
        inquirer.List('name', message="Select a downloaded Ollama model", choices=names),
print("Ollama is not installed or not recognized as a command.")
time.sleep(1)
interpreter.display_message(
    "\nPlease visit [https://ollama.com/](https://ollama.com/) to download Ollama and try again\n"
)
time.sleep(2)
sys.exit(1)
# elif selected_model == "Jan":
# interpreter.display_message(
# """
@@ -108,7 +122,6 @@ def select_local_model():
# 3. Copy the ID of the model and enter it below.
# 3. Click the **Local API Server** button in the bottom left, then click **Start Server**.
# Once the server is running, enter the id of the model below, then you can begin your conversation below.
# """
@@ -117,7 +130,7 @@ def select_local_model():
# interpreter.llm.max_tokens = 1000
# interpreter.llm.context_window = 3000
# time.sleep(1)
# # Prompt the user to enter the name of the model running on Jan
# model_name_question = [
# inquirer.Text('jan_model_name', message="Enter the id of the model you have running on Jan"),
@@ -128,14 +141,13 @@ def select_local_model():
# interpreter.llm.model = ""
# interpreter.display_message(f"\nUsing Jan model: `{jan_model_name}` \n")
# time.sleep(1)
# Set the system message to a minimal version for all local models.
# Set offline for all local models
interpreter.offline = True
interpreter.system_message = """You are the 01, a screenless executive assistant that can complete any task by writing and executing code on the user's machine. Just write a markdown code block! The user has given you full and complete permission.