AI's high electricity consumption has been acknowledged for some time. Less obvious is that the extra words users type when conversing with AI chatbots, while making the exchange feel more polite and courteous, also consume energy to process.
The concern surfaced when X user @tomieinlove posted a question asking how much money OpenAI has spent on electricity processing phrases like "please" and "thank you." Sam Altman, CEO of OpenAI, replied to the post: "Tens of millions of dollars well spent, you never know."
These pleasantries make a conversation feel like talking to a person, but every extra token adds processing cost, and as Altman's figure suggests, those costs multiply across the number of people using ChatGPT each day.
A survey of Americans found that 67% say they speak to AI chatbots politely. That may well be the right approach: multiple studies have shown that sustained rude interaction with AI chatbots gradually degrades the quality of their responses. This poses a challenge for developers: how to preserve polite conversation without driving up processing costs.
Source: Digital Trend
TLDR: Politeness in AI chatbot conversations adds processing costs, but studies suggest it helps maintain response quality.