Written by @xtekky
[!IMPORTANT] By using this repository or any code related to it, you agree to the legal notice. The author is not responsible for the usage of this repository, nor does the author endorse it, nor is the author responsible for any copies, forks, re-uploads made by other users, or anything else related to GPT4Free. This is the author's only account and repository. To prevent impersonation or irresponsible actions, please comply with the GNU GPL license this repository uses.
[!WARNING] "gpt4free" serves as a PoC (proof of concept), demonstrating the development of an API package with multi-provider requests, with features like timeouts, load balancing, and flow control.
pip install -U g4f[all]
docker pull hlohaus789/g4f
Is your site on this repository and you want to take it down? Send an email to takedown@g4f.ai with proof that it is yours, and it will be removed as quickly as possible. To prevent reproduction, please secure your API. 😉
You can always leave some feedback here: https://forms.gle/FeWV9RLEedfdkmFN6
As per the survey, here is a list of improvements to come:
- New openai client syntax (Openai() class) | completed, use g4f.client.Client
Install Docker: Begin by downloading and installing Docker.
Set Up the Container: Use the following commands to pull the latest image and start the container:
docker pull hlohaus789/g4f
docker run \
-p 8080:8080 -p 1337:1337 -p 7900:7900 \
--shm-size="2g" \
-v ${PWD}/har_and_cookies:/app/har_and_cookies \
-v ${PWD}/generated_images:/app/generated_images \
hlohaus789/g4f:latest
Or run this command to start the GUI without a browser and in debug mode:
docker pull hlohaus789/g4f:latest-slim
docker run \
-p 8080:8080 \
-v ${PWD}/har_and_cookies:/app/har_and_cookies \
-v ${PWD}/generated_images:/app/generated_images \
hlohaus789/g4f:latest-slim \
python -m g4f.cli gui -debug
Access the Client: Open http://localhost:8080/chat/ in your browser to use the included web UI, or point your API client at http://localhost:1337/v1.
(Optional) Provider Login: If required, you can access the container's desktop here: http://localhost:7900/?autoconnect=1&resize=scale&password=secret for provider login purposes.
To ensure the seamless operation of our application, please follow the instructions below. These steps are designed to guide you through the installation process on Windows operating systems.
Download the g4f.exe.zip archive and locate the .zip file in your Downloads folder. Unpack it to a directory of your choice on your system, then execute the g4f.exe file to run the app. Open http://localhost:8080/chat/ in your browser to access the application interface. By following these steps, you should be able to successfully install and run the application on your Windows system. If you encounter any issues during the installation process, please refer to our Issue Tracker or reach out on Discord for assistance.
Run the Webview UI on other Platforms:
Run the Web UI on Your Smartphone:
pip install -U g4f[all]
How do I install only certain parts or disable some features? Use partial requirements: /docs/requirements
How do I load the project using git and install the project requirements? Read this tutorial and follow it step by step: /docs/git
How do I build and run the Docker Compose image from source? Use docker-compose: /docs/docker
from g4f.client import Client
client = Client()
response = client.chat.completions.create(
model="gpt-4o-mini",
messages=[{"role": "user", "content": "Hello"}],
# Add any other necessary parameters
)
print(response.choices[0].message.content)
Hello! How can I assist you today?
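You can also pin a specific provider instead of letting g4f choose one automatically. The sketch below is illustrative only: it assumes the OpenaiChat provider (which, as described in the cookies section further down, needs a valid .har file), and the timeout keyword is an assumption rather than a guaranteed parameter.
from g4f.client import Client
from g4f.Provider import OpenaiChat  # provider discussed in the .har section below

# Route all requests through a single, explicitly chosen provider
client = Client(provider=OpenaiChat)
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
    timeout=120,  # assumption: passed through to the provider if supported
)
print(response.choices[0].message.content)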
from g4f.client import Client
client = Client()
response = client.images.generate(
model="flux",
prompt="a white siamese cat",
# Add any other necessary parameters
)
image_url = response.data[0].url
print(f"Generated image URL: {image_url}")
To start the web interface, run the following code in Python:
from g4f.gui import run_gui
run_gui()
or execute the following command:
python -m g4f.cli gui -port 8080 -debug
You can use the Interference API to serve other OpenAI integrations with G4F. See the docs: /docs/interference. Access it at: http://localhost:1337/v1
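Because the Interference API mirrors the OpenAI REST interface, existing OpenAI clients can usually be pointed at it by overriding the base URL. A minimal sketch, assuming the official openai Python package (v1+) is installed and the API is running locally on port 1337; the api_key value is only a placeholder:
from openai import OpenAI

# Point the standard OpenAI client at the local Interference API
client = OpenAI(
    api_key="not-needed",  # placeholder; adjust if your local setup requires a key
    base_url="http://localhost:1337/v1",
)
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)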
Cookies are essential for using Meta AI and Microsoft Designer to create images. Additionally, cookies are required for the Google Gemini and WhiteRabbitNeo providers. From Bing, ensure you have the "_U" cookie, and from Google, all cookies starting with "__Secure-1PSID" are needed.
You can pass these cookies directly to the create function or set them using the set_cookies
method before running G4F:
from g4f.cookies import set_cookies
set_cookies(".bing.com", {
"_U": "cookie value"
})
set_cookies(".google.com", {
"__Secure-1PSID": "cookie value"
})
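Once the cookies are in place, requests can be routed through the matching cookie-backed provider. A minimal sketch, assuming a Gemini provider is exported from g4f.Provider and that the __Secure-1PSID cookie set above is valid; the model name is illustrative:
from g4f.client import Client
from g4f.Provider import Gemini  # assumption: cookie-backed Gemini provider

client = Client(provider=Gemini)
response = client.chat.completions.create(
    model="gemini-pro",  # illustrative model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)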
You can place .har
and cookie files in the default ./har_and_cookies
directory. To export a cookie file, use the EditThisCookie Extension available on the Chrome Web Store.
To capture cookies, you can also create .har
files. For more details, refer to the next section.
You can change the cookies directory and load cookie files in your Python environment. To set the cookies directory relative to your Python file, use the following code:
import os.path
from g4f.cookies import set_cookies_dir, read_cookie_files
import g4f.debug
g4f.debug.logging = True
cookies_dir = os.path.join(os.path.dirname(__file__), "har_and_cookies")
set_cookies_dir(cookies_dir)
read_cookie_files(cookies_dir)
If you enable debug mode, you will see logs similar to the following:
Read .har file: ./har_and_cookies/you.com.har
Cookies added: 10 from .you.com
Read cookie file: ./har_and_cookies/google.json
Cookies added: 16 from .google.com
To utilize the OpenaiChat provider, a .har file is required from https://chatgpt.com/. Follow the steps below to create a valid .har file:
Open https://chatgpt.com/ in your browser and log in, then use your browser's developer tools (Network tab) to export the session as a .har file. Place the exported .har file in the ./har_and_cookies directory if you are using Docker. Alternatively, if you are using Python from a terminal, you can store it in a ./har_and_cookies directory within your current working directory. Note: Ensure that your .har file is stored securely, as it may contain sensitive information.
If you want to hide or change your IP address for the providers, you can set a proxy globally via an environment variable:
- On macOS and Linux:
export G4F_PROXY="http://host:port"
- On Windows:
set G4F_PROXY=http://host:port
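If you prefer to configure the proxy from Python rather than the shell, the same environment variable can be set before g4f sends its first request. A minimal sketch; it assumes G4F_PROXY is read when the request is made:
import os

# Equivalent to the shell commands above
os.environ["G4F_PROXY"] = "http://host:port"

from g4f.client import Client

client = Client()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)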
🎁 Projects powered by gpt4free:
- gpt4free
- gpt4free-ts
- Free AI API's & Potential Providers List
- ChatGPT-Clone
- Ai agent
- ChatGpt Discord Bot
- chatGPT-discord-bot
- Nyx-Bot (Discord)
- LangChain gpt4free
- ChatGpt Telegram Bot
- ChatGpt Line Bot
- Action Translate Readme
- Langchain Document GPT
- python-tgpt
- GPT4js
- VividNode (pyqt-openai)
This project is licensed under the GNU GPL v3.0.