
Your AI Isn’t Stupid. Your Prompts Are.

Okay, let’s be real here. You’ve totally been there, right? I was literally just doing this yesterday - staring at my laptop at 2 AM, getting increasingly frustrated with ChatGPT because it kept giving me these super generic marketing ideas that sounded like they came from a 2015 blog post.

I needed fresh, innovative campaign ideas for a fintech client, and what I got was “create engaging social media content” and “leverage influencer partnerships.” No kidding, ChatGPT. Groundbreaking stuff.

And yeah, my first instinct was to blame the tool. “This thing is overhyped,” I muttered, closing the tab and going back to manually writing everything myself like some kind of digital caveman.

But here’s what I realized (and it kinda stung): The AI wasn’t the problem. My lazy-ass prompts were.

Think about it - founders who just throw random questions at AI like it’s some magic 8-ball get exactly what you’d expect: vague, useless garbage. But the smart ones? They treat it more like that super literal intern who’s actually brilliant but needs really specific instructions. Those founders are getting 10x results while the rest of us are still complaining about “AI limitations.”

I’ve watched this play out with dozens of founders in our community. The ones who get incredible results from AI aren’t necessarily more technical or smarter—they’re just better at asking questions. They’ve developed a skill that most of us never needed before: the art of prompt engineering.

The Real Cost of Bad Prompts (And Why I Almost Gave Up)

Look, every crappy prompt you send is basically lighting your time on fire. And time is literally the only thing we can’t get more of as founders.

Last week, I tracked how much time I spent trying to get a decent content strategy from ChatGPT with vague prompts versus how long it took when I finally figured out how to ask properly. The difference? 2 hours versus 20 minutes. That’s 1 hour and 40 minutes I’ll never get back—time I could have spent on literally anything else.

My buddy Alex (yeah, real name because he said I could share this) made this exact mistake last month. Dude literally typed “write a business plan” into Claude and hit enter. That’s it. No context, no specifics, nothing.

What did he get? Ten pages of the most generic business plan template you’ve ever seen. Like, the kind of stuff that makes investors physically cringe. Alex spent an entire Saturday trying to salvage it before finally giving up and starting over.

His mistake wasn’t using AI - it was being lazy about it. He basically walked up to a world-class consultant and said “help me with business stuff” then wondered why the advice sucked.

I called him after he texted me a screenshot of this disaster. “What exactly did you ask it to write?” I said. When he told me, I couldn’t help laughing. “Dude, that’s like walking into a restaurant and just saying ‘food’ and then being disappointed when they don’t bring you exactly what you wanted.”

(Side note: Alex now uses our platform at EvaluateMyIdea.AI and his prompting game has gotten way better. But I’m getting ahead of myself…)

How to Actually Get Good Answers (3 Techniques That Work)

Getting useful stuff from AI is honestly a skill, and I wish someone had taught me this earlier. It’s less about “asking questions” and more about being a really good director.

Here’s what I’ve learned works (and what doesn’t):

1. Give the AI a Job Title (Seriously)

This sounds weird but stick with me. Instead of asking generic questions like “Should I lower my SaaS pricing?” (which is what I used to do), try this approach:

“You’re a SaaS pricing consultant who’s worked with 50+ early-stage B2B startups over the past 15 years. My product is a project management tool specifically for architects. We’re at $50/user/month right now, seeing about 5% monthly churn. Walk me through the pros and cons of dropping to $40 to boost acquisition.”

See the difference? You’ve basically hired a consultant instead of asking your friend for random advice. The AI now has a specific expertise to draw from, context about your situation, and a clear task.
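If you find yourself writing persona prompts over and over, it's worth templating the pattern. Here's a minimal sketch (the function name and structure are mine, not any particular library's API):

```python
def persona_prompt(role: str, situation: str, task: str) -> str:
    """Assemble a persona-style prompt: who the AI should be,
    the relevant context, and the concrete task."""
    return f"You're {role}. {situation} {task}"

prompt = persona_prompt(
    role="a SaaS pricing consultant who's worked with 50+ early-stage B2B startups",
    situation=("My product is a project management tool for architects. "
               "We're at $50/user/month with about 5% monthly churn."),
    task="Walk me through the pros and cons of dropping to $40 to boost acquisition.",
)
print(prompt)
```

Paste the result into whatever chat tool you use. The point is that role, situation, and task are three separate slots you fill every time, instead of one vague question.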

I started doing this after reading about “persona prompting” somewhere (can’t remember where) and the quality jump was immediate. Last month, I was struggling with our email sequence for new users. Nothing was working. Then I tried: “You’re an email marketing specialist who’s worked with SaaS companies for 10 years. Here’s our current welcome sequence [pasted emails]. Our goal is to increase activation rates. What specific changes would you recommend?”

The response was so good I implemented it almost verbatim, and our activation rate jumped 22% in two weeks.

2. Make It Show Its Work (Like High School Math Class)

You know how math teachers always made you show your work? Same principle applies here. If you just ask AI to “analyze my competitor’s website,” you’ll get some surface-level BS that doesn’t help anyone.

But if you break it down step-by-step:

“Analyze [competitor URL] landing page like this:

  1. What’s their main headline and value prop?
  2. What features do they emphasize most?
  3. What’s their primary CTA?
  4. What social proof are they using?
  5. Based on all this, what’s their conversion strategy?”

This approach forces the AI to actually think through the problem methodically instead of just word-vomiting a summary. I learned this the hard way after getting a bunch of useless competitor analyses that were basically “they have a nice website and good colors.”

Now I get actual insights I can use.
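Since the step-by-step structure is always the same, you can generate the numbered prompt from a plain list of questions. A rough sketch, assuming all you need is a plain-text prompt to paste in:

```python
def stepwise_prompt(subject: str, steps: list[str]) -> str:
    """Turn a list of analysis questions into a numbered, step-by-step prompt."""
    numbered = "\n".join(f"{i}. {step}" for i, step in enumerate(steps, start=1))
    return f"Analyze {subject} like this:\n\n{numbered}"

print(stepwise_prompt("[competitor URL] landing page", [
    "What's their main headline and value prop?",
    "What features do they emphasize most?",
    "What's their primary CTA?",
    "What social proof are they using?",
    "Based on all this, what's their conversion strategy?",
]))
```

Swap in whatever questions matter for your analysis; the numbering is what forces the methodical walkthrough.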

I tried this recently when we were redesigning our pricing page. Instead of asking for general feedback, I had the AI analyze our page section by section, element by element. It spotted three specific issues I’d completely missed, including a confusing feature comparison that was probably costing us conversions.

3. The RICE Framework (My Secret Weapon for Big Tasks)

For the really important stuff - like when you need a solid marketing plan or you’re prepping for investor meetings - I use what I call the RICE framework. I actually made this up after screwing up too many important prompts, but it works.

  • Role - Who is the AI pretending to be?
  • Instruction - What exactly do you want it to do?
  • Context - All the background info it needs
  • Example - Show it what good looks like

So instead of “help me with marketing,” try:

Role: “You’re a content marketing director who’s scaled 3 B2B SaaS companies from 0 to $10M ARR”
Instruction: “Create a 90-day content strategy”
Context: “My target audience is startup founders, budget is $5K/month, main competitor is [X]”
Example: “Good output would include blog topics, one webinar, and 2 case studies with specific timelines”

I call these “megaprompts” and honestly, they’ve saved me so much time. The difference between this and lazy prompting is night and day.
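Because RICE is just four slots, a megaprompt builder is a few lines of code. A minimal sketch (function name and formatting are my own choices):

```python
def rice_prompt(role: str, instruction: str, context: str, example: str) -> str:
    """Assemble a RICE 'megaprompt': Role, Instruction, Context, Example."""
    return (
        f"You're {role}.\n\n"
        f"Task: {instruction}.\n\n"
        f"Context: {context}\n\n"
        f"Example of good output: {example}"
    )

print(rice_prompt(
    role="a content marketing director who's scaled 3 B2B SaaS companies",
    instruction="Create a 90-day content strategy",
    context="My target audience is startup founders, budget is $5K/month",
    example="Blog topics, one webinar, and 2 case studies with specific timelines",
))
```

Having the four parts as named arguments makes it obvious when you've left one blank, which is usually where lazy prompting sneaks back in.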

Last quarter, I used this exact framework to create our content calendar. The AI suggested a series on “founder decision frameworks” that I never would have thought of. That series ended up driving more signups than anything else we published all year.

Stop Wasting Time (Seriously, Just Stop)

Look, your startup can’t afford for you to figure this out through trial and error. I’ve seen too many founders burn weeks getting garbage outputs because they never learned to prompt properly.

Bad prompts → bad data → bad decisions → dead startup. It’s that simple.

I watched a founder friend waste three weeks trying to use AI to help with his financial projections. He kept getting unusable outputs and was ready to give up on AI entirely. I spent 15 minutes helping him reframe his prompts, and suddenly he was getting detailed, actionable financial models that actually made sense for his business.

Learning to prompt well isn’t some nice-to-have skill anymore. It’s literally a competitive advantage. While other founders are still fighting with AI tools, you’ll be getting expert-level insights in minutes.

That’s actually why we built EvaluateMyIdea.AI the way we did. Instead of making you figure out the perfect prompts, our AI agents already know what questions to ask. They pull the right context out of your head automatically and give you a proper evaluation. We’ve basically done all the hard prompting work upfront.

But even with our platform, the founders who provide better context get better results. It’s inescapable—garbage in, garbage out still applies, even with the fanciest AI.

Quick Self-Check Before Your Next AI Chat

Next time you’re about to ask AI for help, pause for a second:

  • Did you give it a specific role to play?
  • Did you include all the context it needs?
  • Did you break down what you want step-by-step?

If you answered no to any of these, you’re basically just playing around instead of actually using the tool.

I’ve started keeping a swipe file of my best prompts—the ones that produced results I actually used. Whenever I need to do something similar, I don’t reinvent the wheel. I grab the prompt template that worked before and adapt it. This alone has probably saved me 5+ hours a week.
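A swipe file doesn't need to be fancy. One way to keep it is a dict of templates with named slots you fill in each time (the template names and wording below are illustrative, not anything I actually ship):

```python
# A prompt swipe file as a dict of reusable templates with named slots.
SWIPE_FILE = {
    "competitor_teardown": (
        "You're a conversion specialist. Analyze {url} step by step: "
        "headline, key features, primary CTA, social proof, overall strategy."
    ),
    "welcome_sequence_review": (
        "You're an email marketing specialist with {years} years in SaaS. "
        "Here's our welcome sequence: {emails}. Goal: increase activation. "
        "What specific changes would you recommend?"
    ),
}

# Adapt a saved template instead of writing the prompt from scratch.
prompt = SWIPE_FILE["welcome_sequence_review"].format(
    years=10, emails="[pasted emails]")
print(prompt)
```

A plain text file or notes app works just as well; the only rule is that prompts that produced results you actually used get saved with their blanks clearly marked.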

Remember: AI isn’t magic. It’s a tool. And like any tool, your results depend on how well you use it.


Frequently Asked Questions

Q: How do I prompt AI to get better answers for my startup?
A: Give the AI a specific role, provide clear instructions, include all relevant context, and break down your request step-by-step for the best results.

Q: What are common mistakes founders make when using AI?
A: Asking vague or generic questions, not providing enough context, and expecting expert-level answers without clear direction.

Q: Why is prompting so important when using AI for startups?
A: Effective prompting turns AI into a powerful co-founder, saving time and providing actionable insights, while poor prompts lead to generic or useless outputs.

Q: Can AI replace human expertise in business decisions?
A: AI can accelerate research and provide valuable feedback, but human judgment and domain expertise are still essential for critical decisions.