This is a script analysis and execution system built on Streamlit and Ollama. The system analyzes user-provided text, determines the appropriate generator type, generates the corresponding script name, and executes the matching script. The entire workflow is presented through a user-friendly web interface.
- Text Input: Users can input the text that needs to be analyzed.
- Automatic Analysis: The system uses the Ollama model to analyze the input and determine the appropriate generator type.
- Script Selection: Based on the analysis results, the system generates the corresponding script name.
- Script Execution: If a matching script is found, the system will execute the script and return the result.
- Result Display: The execution result is directly displayed on the web interface.
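The flow above can be sketched roughly as follows. This is a simplified illustration, not the project's actual implementation: the function name `classify_generator_type` and the `SCRIPT_MAP` entries are hypothetical, and the classification step is stubbed out with a keyword heuristic where the real system queries an Ollama model through Langchain.

```python
import subprocess

# Hypothetical mapping from generator type to script path; the real
# project derives script locations from config.py (paths here are invented).
SCRIPT_MAP = {
    "advanced": "src/advanced_scripts/run.py",
    "expert": "src/expert_scripts/run.py",
    "python": "src/python_scripts/run.py",
}

def classify_generator_type(text: str) -> str:
    """Stand-in for the Ollama call that classifies the user's input.

    The real system sends the text to a local Ollama model and parses
    its answer; this sketch uses a trivial keyword heuristic instead.
    """
    lowered = text.lower()
    if "expert" in lowered:
        return "expert"
    if "python" in lowered:
        return "python"
    return "advanced"

def analyze_and_execute(text: str) -> str:
    """Classify the input, look up the matching script, and run it."""
    generator_type = classify_generator_type(text)
    script = SCRIPT_MAP.get(generator_type)
    if script is None:
        return f"No script found for type: {generator_type}"
    result = subprocess.run(
        ["python3", script, text], capture_output=True, text=True
    )
    return result.stdout if result.returncode == 0 else result.stderr
```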
- Python 3.10+
- Streamlit
- Langchain
- Ollama
1. Clone the project repository:

   ```bash
   git clone git@github.com:gszhangwei/task_assistant.git
   cd task_assistant
   ```

2. Create and activate a virtual environment (optional):

   ```bash
   python3 -m venv venv
   source venv/bin/activate  # On Windows use venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip3 install -r requirements.txt
   ```

4. Install and set up Ollama:
   - Visit the Ollama official website to download and install Ollama
   - Ensure the Ollama service is running by executing:

     ```bash
     ollama run llama3.1
     ```

5. Configure the project:
   - Check and update the configuration information in the `config.py` file
   - Ensure the paths in `GENERATOR_CONFIG` are set correctly
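As an illustration of what to check in `config.py`, a `GENERATOR_CONFIG` might look like the sketch below. The actual keys and paths are project-specific and not shown in this README; every name and path here is a placeholder.

```python
from pathlib import Path

# Placeholder sketch of GENERATOR_CONFIG; the real config.py defines
# the actual generator types and script directories for this project.
BASE_DIR = Path(".").resolve()

GENERATOR_CONFIG = {
    "advanced": {"script_dir": BASE_DIR / "src" / "advanced_scripts"},
    "expert": {"script_dir": BASE_DIR / "src" / "expert_scripts"},
    "python": {"script_dir": BASE_DIR / "src" / "python_scripts"},
}
```

When verifying the configuration, make sure each `script_dir` points at an existing directory on your machine.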
Execute the following command in the project root directory:

```bash
streamlit run src/app.py
```

Then visit the displayed URL in your browser (usually http://localhost:8501).
- Enter the content you want to analyze in the text box.
- Click the "Analyze and Execute" button.
- The system will automatically analyze your input and execute the corresponding script.
- The execution result will be displayed on the page.
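The UI steps above can be wired together in a few lines. The sketch below mirrors, but is not copied from, the project's `src/app.py`: the Streamlit module and the core function (from `execute_script_with_ollama.py`) are passed in as parameters, which is an assumption made here purely so the sketch stays testable without a running Streamlit server.

```python
def render_app(st, analyze_and_execute):
    """Render the minimal UI: a text box, a button, and the result.

    `st` is the streamlit module and `analyze_and_execute` is the core
    entry point; both are injected rather than imported (an illustrative
    choice, not necessarily how the real app.py is structured).
    """
    st.title("Task Assistant")
    text = st.text_area("Enter the content to analyze")
    if st.button("Analyze and Execute") and text:
        result = analyze_and_execute(text)
        st.write(result)  # display the execution result on the page
        return result
    return None
```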
```
project/
│
├── src/
│   ├── advanced_scripts/
│   ├── expert_scripts/
│   ├── python_scripts/
│   ├── app.py                         # Main Streamlit application file
│   ├── execute_script_with_ollama.py  # Core logic implementation
│   ├── prompt_generator_factory.py    # Prompt generator factory
│   └── config.py                      # Configuration file
│
├── requirements.txt                   # Project dependencies
└── README.md                          # Project documentation
```
Feel free to submit issues and pull requests. This is an early version with plenty of room for improvement; feedback and contributions are welcome.