When Not To Use AI

AI has become a regular part of how I work. It helps me move faster, think more clearly, and cut down on the small stuff that can slow me down. I use it to draft, summarize, analyze, and explore ideas. When I’m intentional about how I use it, AI is a real force multiplier.

The more I rely on AI, the more intentional I’ve had to be about where it fits and where it doesn’t.

Not everything should be automated. Some tasks actually benefit from a little friction. Others need judgment, accountability, or context that no tool can fully grasp. Figuring out when not to use AI is just as important as knowing where it helps.

At the team or company level, moving fast for its own sake puts trust, clarity, and responsibility at risk.

When decisions require ownership

AI is great at surfacing options, spotting patterns, and laying out tradeoffs. It can help me think through scenarios and challenge my assumptions. But it can’t own the outcome of a decision.

Final decisions need real judgment. They mean understanding the human impact, the dynamics of the team, timing, and long-term consequences that rarely show up in the data. As a leader, I’ve found that accountability and decision making go hand in hand.

I use AI to inform my decisions, not to make them for me. When things go sideways, people want clarity and ownership from their leaders, not a technical explanation of what a model said.

When evaluating people and performance

Performance evaluation is one of the most sensitive responsibilities a leader holds.

Performance is shaped by context, constraints, collaboration, growth, and unseen effort. AI can gather the facts, but it can't understand that human side.

Performance conversations need discernment, empathy, and trust. They’re as much about listening as they are about analysis. Handing this off to a tool just creates distance at the exact moment when being present matters most.

That’s why I keep performance evaluation a fully human responsibility.

When conversations are difficult or uncomfortable

Some leadership moments are just uncomfortable. Addressing misalignment, giving tough feedback, or working through conflict aren't things you can, or should, try to optimize away.

AI can help me prepare for these conversations, organizing my thoughts or clarifying intent, but it shouldn't replace direct communication.

Trust is built through presence, honesty, and follow-through. Outsourcing hard conversations undermines the relationships leaders are responsible for maintaining. If a conversation matters, it deserves full attention.

When defining strategy and direction

Strategy isn't just a list of recommendations or trends. It's about making intentional choices: where to focus, where to invest, and what tradeoffs to make.

Those choices are shaped by values, risk tolerance, and long-term direction. They require leaders to get aligned and to say no to good ideas so they can say yes to better ones.

AI can offer perspective and analysis, but it shouldn't define the strategy or replace leadership judgment. Setting direction remains a human job.

Why these boundaries matter

These boundaries aren’t about holding AI back. They’re about protecting what makes leadership clear and effective.

If AI replaces judgment, accountability gets fuzzy. If it replaces presence, trust slips. Unthinking automation creates distance, not leverage.

Being clear about where AI doesn’t belong helps teams know what to expect and builds trust that we’re using technology responsibly.

Keeping leadership visible

The goal isn’t just more automation. It’s better leadership.

AI is most effective when it sharpens thinking, reduces unnecessary friction, and supports human decision-making. Used this way, it strengthens leadership rather than weakening it.

In summary: use AI to assist human judgment, not replace it, especially in decisions, performance evaluation, hard conversations, and strategy. Clear boundaries on AI use keep leadership accountable, trustworthy, and human.