Your AI Isn't Stupid. Your Prompts Are.
Okay, let's be real here. You've totally been there, right? I was literally just doing this yesterday - staring at my laptop at 2 AM, getting increasingly frustrated with ChatGPT because it kept giving me these super generic marketing ideas that sounded like they came from a 2015 blog post.
I needed fresh, innovative campaign ideas for a fintech client, and what I got was "create engaging social media content" and "leverage influencer partnerships." No kidding, ChatGPT. Groundbreaking stuff.
And yeah, my first instinct was to blame the tool. "This thing is overhyped," I muttered, closing the tab and going back to manually writing everything myself like some kind of digital caveman.
But here's what I realized (and it kinda stung): The AI wasn't the problem. My lazy-ass prompts were.
Think about it - founders who just throw random questions at AI like it's some magic 8-ball get exactly what you'd expect: vague, useless garbage. But the smart ones? They treat it more like that super literal intern who's actually brilliant but needs really specific instructions. Those founders are getting 10x results while the rest of us are still complaining about "AI limitations."
I've watched this play out with dozens of founders in our community. The ones who get incredible results from AI aren't necessarily more technical or smarter; they're just better at asking questions. They've developed a skill that most of us never needed before: the art of prompt engineering.
The Real Cost of Bad Prompts (And Why I Almost Gave Up)
Look, every crappy prompt you send is basically lighting your time on fire. And time is the one thing we can't get more of as founders.
Last week, I tracked how much time I spent trying to get a decent content strategy from ChatGPT with vague prompts versus how long it took when I finally figured out how to ask properly. The difference? 2 hours versus 20 minutes. That's 1 hour and 40 minutes I'll never get back - time I could have spent on literally anything else.
My buddy Alex (yeah, real name because he said I could share this) made this exact mistake last month. Dude literally typed "write a business plan" into Claude and hit enter. That's it. No context, no specifics, nothing.
What did he get? Ten pages of the most generic business plan template you've ever seen. Like, the kind of stuff that makes investors physically cringe. Alex spent an entire Saturday trying to salvage it before finally giving up and starting over.
His mistake wasn't using AI - it was being lazy about it. He basically walked up to a world-class consultant, said "help me with business stuff," then wondered why the advice sucked.
I called him after he texted me a screenshot of this disaster. "What exactly did you ask it to write?" I said. When he told me, I couldn't help laughing. "Dude, that's like walking into a restaurant, just saying 'food,' and then being disappointed when they don't bring you exactly what you wanted."
(Side note: Alex now uses our platform at EvaluateMyIdea.AI and his prompting game has gotten way better. But I'm getting ahead of myself…)
How to Actually Get Good Answers (3 Techniques That Work)
Getting useful stuff from AI is honestly a skill, and I wish someone had taught me this earlier. It's less about "asking questions" and more about being a really good director.
Here's what I've learned works (and what doesn't):
1. Give the AI a Job Title (Seriously)
This sounds weird, but stick with me. Instead of asking generic questions like "Should I lower my SaaS pricing?" (which is what I used to do), try this approach:
"You're a SaaS pricing consultant who's worked with 50+ early-stage B2B startups over the past 15 years. My product is a project management tool specifically for architects. We're at $50/user/month right now, seeing about 5% monthly churn. Walk me through the pros and cons of dropping to $40 to boost acquisition."
See the difference? You've basically hired a consultant instead of asking your friend for random advice. The AI now has specific expertise to draw from, context about your situation, and a clear task.
I started doing this after reading about "persona prompting" somewhere (can't remember where), and the quality jump was immediate. Last month, I was struggling with our email sequence for new users. Nothing was working. Then I tried: "You're an email marketing specialist who's worked with SaaS companies for 10 years. Here's our current welcome sequence [pasted emails]. Our goal is to increase activation rates. What specific changes would you recommend?"
The response was so good I implemented it almost verbatim, and our activation rate jumped 22% in two weeks.
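If you ever script your prompts through an API instead of typing into a chat window, the persona naturally lives in the system message. Here's a minimal sketch of that pattern; `persona_messages` is my own helper name, and the actual model call is left out:

```python
def persona_messages(persona: str, task: str) -> list[dict]:
    """Build a chat-style message list with the 'job title' up front.

    The system message carries the persona; the user message carries
    the specific task plus context. This dict shape matches the common
    chat-completion format, but no API is called here.
    """
    return [
        {"role": "system", "content": f"You are {persona}."},
        {"role": "user", "content": task},
    ]

messages = persona_messages(
    persona=("a SaaS pricing consultant who's worked with 50+ "
             "early-stage B2B startups over the past 15 years"),
    task=("My product is a project management tool for architects. "
          "We're at $50/user/month with about 5% monthly churn. "
          "Walk me through the pros and cons of dropping to $40."),
)
```

Same idea as typing the persona into the chat box, just reusable.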
2. Make It Show Its Work (Like High School Math Class)
You know how math teachers always made you show your work? Same principle applies here. If you just ask AI to "analyze my competitor's website," you'll get some surface-level BS that doesn't help anyone.
But if you break it down step-by-step:
"Analyze [competitor URL] landing page like this:
- What's their main headline and value prop?
- What features do they emphasize most?
- What's their primary CTA?
- What social proof are they using?
- Based on all this, what's their conversion strategy?"
This approach forces the AI to actually think through the problem methodically instead of just word-vomiting a summary. I learned this the hard way after getting a bunch of useless competitor analyses that were basically "they have a nice website and good colors."
Now I get actual insights I can use.
I tried this recently when we were redesigning our pricing page. Instead of asking for general feedback, I had the AI analyze our page section by section, element by element. It spotted three specific issues I'd completely missed, including a confusing feature comparison that was probably costing us conversions.
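If you reuse a checklist like this across competitors, it's easy to generate the prompt from a plain list of questions. A quick sketch (the function name and step list are just my own illustration):

```python
# The step-by-step questions from the checklist above.
ANALYSIS_STEPS = [
    "What's their main headline and value prop?",
    "What features do they emphasize most?",
    "What's their primary CTA?",
    "What social proof are they using?",
    "Based on all this, what's their conversion strategy?",
]

def landing_page_prompt(url: str, steps: list[str] = ANALYSIS_STEPS) -> str:
    """Compose the question list into one step-by-step analysis prompt."""
    bullets = "\n".join(f"- {q}" for q in steps)
    return f"Analyze the {url} landing page like this:\n{bullets}"

print(landing_page_prompt("https://example.com"))
```

Swap in a different URL (or question list) and you've got a repeatable competitor-analysis prompt.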
3. The RICE Framework (My Secret Weapon for Big Tasks)
For the really important stuff - like when you need a solid marketing plan or you're prepping for investor meetings - I use what I call the RICE framework. I actually made this up after screwing up too many important prompts, but it works.
- Role - Who is the AI pretending to be?
- Instruction - What exactly do you want it to do?
- Context - All the background info it needs
- Example - Show it what good looks like
So instead of "help me with marketing," try:
Role: "You're a content marketing director who's scaled 3 B2B SaaS companies from 0 to $10M ARR"
Instruction: "Create a 90-day content strategy"
Context: "My target audience is startup founders, budget is $5K/month, main competitor is [X]"
Example: "Good output would include blog topics, one webinar, and 2 case studies with specific timelines"
I call these "megaprompts" and honestly, they've saved me so much time. The difference between this and lazy prompting is night and day.
Last quarter, I used this exact framework to create our content calendar. The AI suggested a series on "founder decision frameworks" that I never would have thought of. That series ended up driving more signups than anything else we published all year.
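If you build megaprompts often, the four RICE parts slot neatly into a template. Here's a sketch; `RicePrompt` is a made-up name for illustration, not an existing library:

```python
from dataclasses import dataclass

@dataclass
class RicePrompt:
    """Role, Instruction, Context, Example -> one combined 'megaprompt'."""
    role: str
    instruction: str
    context: str
    example: str

    def render(self) -> str:
        # Stitch the four parts into a single prompt string.
        return (
            f"You are {self.role}.\n\n"
            f"Task: {self.instruction}\n\n"
            f"Context: {self.context}\n\n"
            f"Example of good output: {self.example}"
        )

prompt = RicePrompt(
    role="a content marketing director who's scaled 3 B2B SaaS "
         "companies from 0 to $10M ARR",
    instruction="Create a 90-day content strategy",
    context="My target audience is startup founders, budget is $5K/month",
    example="Blog topics, one webinar, and 2 case studies with timelines",
).render()
```

Filling in the same four fields each time keeps you from forgetting the context or the example.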
Stop Wasting Time (Seriously, Just Stop)
Look, your startup can't afford for you to figure this out through trial and error. I've seen too many founders burn weeks getting garbage outputs because they never learned to prompt properly.
Bad prompts → bad data → bad decisions → dead startup. It's that simple.
I watched a founder friend waste three weeks trying to use AI to help with his financial projections. He kept getting unusable outputs and was ready to give up on AI entirely. I spent 15 minutes helping him reframe his prompts, and suddenly he was getting detailed, actionable financial models that actually made sense for his business.
Learning to prompt well isn't some nice-to-have skill anymore. It's literally a competitive advantage. While other founders are still fighting with AI tools, you'll be getting expert-level insights in minutes.
That's actually why we built EvaluateMyIdea.AI the way we did. Instead of making you figure out the perfect prompts, our AI agents already know what questions to ask. They pull the right context out of your head automatically and give you a proper evaluation. We've basically done all the hard prompting work upfront.
But even with our platform, the founders who provide better context get better results. It's inescapable: garbage in, garbage out still applies, even with the fanciest AI.
Quick Self-Check Before Your Next AI Chat
Next time you're about to ask AI for help, pause for a second:
- Did you give it a specific role to play?
- Did you include all the context it needs?
- Did you break down what you want step-by-step?
If you answered no to any of these, you're basically just playing around instead of actually using the tool.
I've started keeping a swipe file of my best prompts - the ones that produced results I actually used. Whenever I need to do something similar, I don't reinvent the wheel. I grab the prompt template that worked before and adapt it. This alone has probably saved me 5+ hours a week.
Remember: AI isn't magic. It's a tool. And like any tool, your results depend on how well you use it.
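The lowest-tech swipe file is a notes doc, but if you script your prompts, named templates with placeholders do the same job. A sketch (the template names and wording here are mine, not from any tool):

```python
# A tiny "swipe file": named prompt templates with {placeholders}
# for the parts that change each time.
SWIPE_FILE = {
    "welcome_email_review": (
        "You're an email marketing specialist who's worked with SaaS "
        "companies for 10 years. Here's our current welcome sequence:\n"
        "{sequence}\n"
        "Our goal is to increase activation rates. "
        "What specific changes would you recommend?"
    ),
    "pricing_tradeoff": (
        "You're a SaaS pricing consultant. Product: {product}. "
        "Current price: {price}/user/month. Walk me through the pros "
        "and cons of changing it to {new_price}."
    ),
}

def fill(name: str, **fields) -> str:
    """Grab a saved template and adapt it with today's specifics."""
    return SWIPE_FILE[name].format(**fields)
```

Instead of rewriting a prompt from scratch, you call `fill("pricing_tradeoff", product=..., price=..., new_price=...)` and tweak from there.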
Frequently Asked Questions
Q: How do I prompt AI to get better answers for my startup?
A: Give the AI a specific role, provide clear instructions, include all relevant context, and break down your request step-by-step for the best results.
Q: What are common mistakes founders make when using AI?
A: Asking vague or generic questions, not providing enough context, and expecting expert-level answers without clear direction.
Q: Why is prompting important for AI for startups?
A: Effective prompting turns AI into a powerful co-founder, saving time and providing actionable insights, while poor prompts lead to generic or useless outputs.
Q: Can AI replace human expertise in business decisions?
A: AI can accelerate research and provide valuable feedback, but human judgment and domain expertise are still essential for critical decisions.