diff --git a/chatgpt-telegram-bot/en/v.0.0.1/404.html b/chatgpt-telegram-bot/en/v.0.0.1/404.html
new file mode 100644
index 0000000..6837cf9
--- /dev/null
+++ b/chatgpt-telegram-bot/en/v.0.0.1/404.html
@@ -0,0 +1,364 @@
+Introducing the ultimate AI-powered chatbot for Telegram - the perfect companion for anyone looking for quick, accurate answers to their questions. Our bot uses state-of-the-art machine learning technology powered by the ChatGPT model, allowing it to understand natural language inputs and respond with human-like precision.
+With our bot, you can ask anything you want, from simple queries to complex questions. Whether you're looking for information on the latest news, need help with a math problem, or just want to chat with a virtual friend, our bot has you covered.
+What's more, our bot is designed to be easy to use and user-friendly. Simply type in your question or query, and the bot will do the rest, providing you with a quick and accurate response in seconds. No more scrolling through endless search results or struggling to find the information you need - our bot does all the hard work for you.
+So why wait? Try out our ChatGPT-powered Telegram bot today and experience the future of AI-powered chatbots for yourself!
+(c) Generated by ChatGPT
+
But seriously, the project uses your ChatGPT token to access the ChatGPT API and let you chat with ChatGPT directly in Telegram.
+See for yourself how easy and convenient it is to use ChatGPT in Telegram. Seeing it once is better than hearing about it a hundred times.
/clear_context - Clears the conversation context. Under the hood, it deletes the current chat and creates a new one.
/prompt your_question - Lets you ask a one-off question outside the context of the main conversation (see the example below).
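For example, a one-off question could look like this; the question text itself is purely illustrative:

/prompt Explain the difference between SOCKS5 and HTTP proxies in one sentence.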
See my last name in the domain? I'm a developer and blogger, and I'm publicly visible, so it would do me no good to tarnish my own name. I assure you that your data is not passed to third parties - even I don't know what you ask ChatGPT. You can verify this yourself by examining the code; it's open source.
+Enough words, let's launch your personal ChatGPT Telegram bot. 🚀
+Support
The following platforms are supported: linux/amd64, linux/arm64/v8.
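Assuming the image is published as a multi-arch manifest (which the platform list suggests), Docker picks the right architecture automatically. If you ever need to force one, the standard --platform flag of docker pull can be used; this is generic Docker behaviour, not a project-specific option:

docker pull --platform linux/arm64/v8 upagge/chatgpt-telegram-bot:latest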
+docker run -it --name chatgpt-telegram-bot \
+ --env TELEGRAM_BOT_TOKEN= \
+ --env TELEGRAM_BOT_USERNAME= \
+ --env TELEGRAM_PERSON_ID= \
+ --env CHAT_GPT_TOKEN= \
+ upagge/chatgpt-telegram-bot:develop
+
TELEGRAM_BOT_USERNAME
- Specify the bot's username here, the one ending in bot, not its public display name.
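Once the container is up, you can check that the bot started cleanly with a standard Docker command (the container name comes from the --name flag above):

docker logs -f chatgpt-telegram-bot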
If Telegram is blocked in your country, you can specify proxy settings for the connection.
+docker run -it --name chatgpt-telegram-bot \
+ --env TELEGRAM_BOT_TOKEN= \
+ --env TELEGRAM_BOT_USERNAME= \
+ --env TELEGRAM_PERSON_ID= \
+ --env CHAT_GPT_TOKEN= \
+ --env TELEGRAM_PROXY_ENABLE=true \
+ --env TELEGRAM_PROXY_HOST= \
+ --env TELEGRAM_PROXY_PORT= \
+ --env TELEGRAM_PROXY_TYPE=SOCKS5 \
+ --env TELEGRAM_PROXY_USERNAME= \
+ --env TELEGRAM_PROXY_PASSWORD= \
+ upagge/chatgpt-telegram-bot:latest
+
Available values for TELEGRAM_PROXY_TYPE: SOCKS5, SOCKS4, HTTP.
Don't forget to create a .env file with the variables (a sample is shown after the compose file below).
version: '3.8'
+services:
+
+ chat-gpt:
+ image: upagge/chatgpt-telegram-bot:latest
+ container_name: chatgpt-bot
+ restart: always
+ environment:
+ TELEGRAM_BOT_TOKEN: ${TELEGRAM_BOT_TOKEN}
+ TELEGRAM_BOT_USERNAME: ${TELEGRAM_BOT_USERNAME}
+ TELEGRAM_PERSON_ID: ${TELEGRAM_PERSON_ID}
+ CHAT_GPT_TOKEN: ${CHAT_GPT_TOKEN}
+
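The ${...} placeholders above are read from the .env file mentioned earlier. A minimal sketch of that file, with every value a placeholder you must replace with your own:

# .env - placeholder values only, replace each entry with your own
TELEGRAM_BOT_TOKEN=123456789:replace-with-your-bot-token
TELEGRAM_BOT_USERNAME=my_chatgpt_bot
TELEGRAM_PERSON_ID=replace-with-your-telegram-id
CHAT_GPT_TOKEN=replace-with-your-chatgpt-token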
Sponsorship makes a project sustainable because it pays for the time of the maintainers of that project, a very scarce resource that is spent on developing new features, fixing bugs, improving stability, solving problems, and general support. The biggest bottleneck in Open Source is time.
+struchkov-mark.ton
bc1pt49vnp43c4mktk6309zlq3020dzd0p89gc8d90zzn4sgjvck56xs0t86vy
0x7668C802Bd71Be965671D4Bbb1AD90C7f7f32921
0xDa41aC95f606850f2E01ba775e521Cd385AA7D03
For now the gpt-3.5-turbo model is used. In future versions you will be able to choose the model.
If you don't understand something, you can ask a question here or in a discussion on GitHub.