I was chatting with a pioneer in AI negotiation last week who said “The great thing about AI negotiation is it removes all the emotion from the discussion.” But does it?
I think it removes the dread, the fear, the negative emotions—but our experience of negotiation is anything but emotionless.
Is this controversial? What do you think?
At Nibble, we believe a great negotiation defines a relationship, builds trust, and develops understanding. Five years ago, when my co-founder Jamie Ettedgui was negotiating in the Istanbul markets, he came away not only with a price that worked for both buyer and seller but also with a big smile on his face. This was the inspiration for our company!
Is this cultural? The AI negotiator who wanted to remove emotions was from the US. Meanwhile, here in the UK, many of my friends would rather overpay than ask for a discount and risk haggling! But we have clients in the Middle East, India, and Brazil, and when I asked one of them, she said: “But Rosie, if I didn’t negotiate, how would I understand what my customer really wanted?”
My video tip this week talked about translating the emotional gives in a negotiation into your strategy, effectively putting a value on them. But outside the human-to-human field of negotiation, how can you harness emotions for good in the AI world?
Designing AI with Empathy
In a world where AI agents are becoming the norm, the challenge isn't just implementation; it's making the agent effective. Too many chatbots feel like talking to a brick wall, frustrating users rather than helping them. But a well-designed AI can interact with a thousand customers at once while making each one feel heard. The key? Empathy.
Setting the Right Expectations
One friend of mine proudly refuses to engage with any chatbots at all and always starts with: “Human, please.” But I think this will diminish as conversation design improves. Often, frustration stems from poor framing. AI interactions should be clear from the outset, guiding users on what to expect. Instead of vague prompts like “Ask me anything,” structure the conversation with clear options or suggested actions.
The golden rule: Never pretend to be human. Trust is hard to gain and easy to lose. Be upfront that users are speaking with AI, and set expectations accordingly.
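Those two ideas (clear options, and being upfront about the AI) can be combined in the very first message. Here is a minimal sketch in Python; the function name and option labels are illustrative, not Nibble's actual API:

```python
def opening_message():
    """Frame the conversation from the outset: disclose that the user
    is talking to an AI, and offer concrete options instead of an
    open-ended 'Ask me anything'."""
    return {
        "text": "Hi! I'm an AI assistant (not a human). I can help with:",
        "options": ["Track my order", "Make an offer", "Returns and refunds"],
    }
```

The structured `options` list gives the user a clear sense of what the agent can do, which is exactly the framing that prevents frustration later.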
Keeping It Conversational
Human conversations flow naturally—short turns, one question at a time, and clear responses. AI should replicate this rhythm. When dealing with important topics, slowing down the pacing (e.g., using message delays) can mimic human emphasis, making information easier to process.
For example, when a customer is negotiating, a well-placed pause before revealing a final offer makes the response feel more considered. Similarly, vague responses like “That works” should be avoided. Instead, use explicit confirmations to ensure clarity.
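As a rough sketch of the pacing idea, a reply's delay can be scaled to its length and clamped to a sensible range. This is a hypothetical helper, not Nibble's implementation; in production the delay would drive a typing indicator rather than a blocking sleep:

```python
import time

def send_paced(messages, delay_per_char=0.03, min_delay=0.8, max_delay=2.5):
    """Send messages one at a time, pausing in proportion to length so
    that important replies (like a final offer) feel considered rather
    than instantaneous."""
    sent = []
    for text in messages:
        # Longer messages get longer pauses, clamped between min and max.
        delay = min(max(len(text) * delay_per_char, min_delay), max_delay)
        time.sleep(delay)  # stand-in for showing a typing indicator
        sent.append(text)
    return sent
```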
Handling Misunderstandings
Misunderstandings are inevitable, but they don’t have to derail the conversation. Instead of repeating the same misunderstood response with the deeply annoying “Sorry, I didn’t understand,” AI should reframe the message or clarify what it can do. If it fails multiple times, escalate to a person quickly or use generative AI to refine responses.
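One way to sketch this in Python: track consecutive misunderstandings, vary the clarification wording instead of repeating it, and hand off to a human after a couple of failures. The class and messages below are illustrative assumptions, not a real product API:

```python
class FallbackPolicy:
    """Count consecutive misunderstandings in a conversation. Reframe
    the clarification each time rather than repeating 'Sorry, I didn't
    understand', then escalate to a person after repeated failures."""

    CLARIFICATIONS = [
        "I can help with orders, offers, and returns. Which is this about?",
        "Let me try another way: could you pick one of the options above?",
    ]

    def __init__(self, max_failures=2):
        self.failures = 0
        self.max_failures = max_failures

    def on_misunderstanding(self):
        if self.failures < self.max_failures:
            # Pick a different reframing each time, never the same line twice.
            msg = self.CLARIFICATIONS[self.failures % len(self.CLARIFICATIONS)]
            self.failures += 1
            return ("clarify", msg)
        return ("escalate", "Connecting you with a colleague who can help.")

    def on_success(self):
        # A understood turn resets the counter.
        self.failures = 0
```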
Which brings me to…
The Role of Generative AI
Generative AI offers massive potential for AI agents, but it comes with risks. An AI trained on a company’s knowledge base can provide 24/7 customer support, but without guardrails, it can generate misleading or even damaging responses.
Cases like Chevrolet’s ChatGPT-powered bot and Air Canada’s misleading refund policy highlight the importance of rigorous testing. What AI says is as legally binding as what a human representative would say—so businesses must set clear ethical boundaries and risk tolerance before deploying generative AI.
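The simplest form of guardrail is a post-check on the generated reply before it reaches the customer. The sketch below uses a keyword blocklist purely to illustrate the shape of the check; a real deployment would rely on policy classifiers and human-reviewed rules, and the phrases here are invented examples:

```python
# Illustrative blocklist of commitments the bot must never make.
FORBIDDEN = ["legally binding", "guaranteed refund", "we promise"]

def guarded_reply(generate, prompt, fallback="Let me check that with a colleague."):
    """Run a generated reply through a post-check before sending it.
    If the reply makes a forbidden commitment, substitute a safe
    fallback instead of the raw model output."""
    reply = generate(prompt)
    if any(phrase in reply.lower() for phrase in FORBIDDEN):
        return fallback
    return reply
```

The design point is that the model's output is treated as untrusted until it passes the business's own rules, mirroring the review a human representative's statements would get.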
Finding the Right Personality
Every AI agent has a personality, whether intentional or not.
A retail assistant might be casual and friendly, using emojis and exclamation marks, while a financial services AI should be more professional and precise. Subtle differences in tone and phrasing can make AI feel like a natural extension of the brand. Copywriting skills are essential.
More Than Just Words
Good AI design isn’t just about text. Visual cues, such as typing indicators and message delays, can improve the user experience. If the system needs time to process, a simple loading indicator reassures users that something is happening. In customer interactions, small touches like emoji reactions or bold text can help guide the conversation effectively.
You’d be surprised how powerful this can be. I must have done 500 Nibble demos on video calls over the last few years, and just watching someone’s face when Nibble replies with an emoji has convinced me that these small signals can completely shift user perception. In fact, we use them to signal how close we are to agreement. A crying emoji often shocks users at first. But then they see Nibble as a real counterpart and adjust their stance rather than pushing an extreme position without concession.
Takeaways When Designing AI with Empathy
- Don’t rush it. A poorly designed AI agent is worse than none at all.
- Test, test, and test again—especially for generative AI implementations.
- Keep it simple and direct. Clear messaging prevents confusion.
- Listen to your users. AI interactions are a goldmine of customer insights—use them.
AI negotiation isn’t just a digital tool; it’s an extension of your brand’s customer experience. When designed with empathy, it can transform frustrating interactions into meaningful, seamless conversations that build long-term customer loyalty.
Find out more from Nibble's experience negotiating 100,000 times a month here.
Interested in Nibble?