Customer Discovery Interview Prep -- Five AI Opportunities
Generated: 2026-03-09
Method: Mom Test (past behavior, spending, specific incidents -- zero hypotheticals)
Table of Contents
- On-Premise AI Appliance for Regulated Industries
- Sovereign AI Agent Platform
- AI-Powered Identity/KYC Platform
- Compliance AI Agent for Fintech
- Privacy-First AI Copilot for Developers
- Meta-Discovery: Should I Productize Nanobot or Build New?
General Mom Test Rules (Reference Card)
- Never say "would you," "could you," "do you think," or "if we built X..."
- Ask about the LAST TIME something happened, not whether it happens
- Ask about WHAT THEY DID, not what they wish they did
- Ask about MONEY THEY SPENT, not money they'd hypothetically spend
- Compliments are noise. Commitments are signal.
- The meeting was a failure if you left feeling good but came away with no concrete next step
1. On-Premise AI Appliance for Regulated Industries
Target Personas
Persona A: CISO (Chief Information Security Officer)
- Titles: CISO, VP of Information Security, Director of Cybersecurity
- Where: Mid-to-large law firms (100+ attorneys), regional hospital systems, mid-market financial advisory firms, defense subcontractors (ITAR/CMMC-regulated)
- Find them: ISC2 chapters, local ISSA chapters, BSides conferences, CISO executive roundtables, Evanta CISO summits
Persona B: IT Director / VP of Infrastructure
- Titles: IT Director, VP of IT Infrastructure, Director of Enterprise Technology, Head of IT Operations
- Where: Healthcare systems (2-10 hospitals), regional banks and credit unions, AmLaw 100-200 firms
- Find them: HIMSS (healthcare IT), CHIME, local CIO/CTO meetups, Spiceworks community
Persona C: Compliance Officer / Chief Compliance Officer
- Titles: CCO, Chief Compliance Officer, Director of Regulatory Compliance, Data Protection Officer (DPO)
- Where: Financial services firms, HIPAA-covered entities, defense contractors needing CMMC Level 2+
- Find them: Compliance Week conferences, SCCE (Society of Corporate Compliance and Ethics), IAPP (privacy professionals)
Interview Questions (15)
Understanding Current Pain
- "Walk me through the last time an employee used ChatGPT or a similar AI tool for work. What happened, and how did you find out about it?"
- "Tell me about the last data security incident or near-miss you dealt with that involved a third-party cloud service. What did that cost you -- in money, time, or both?"
- "When was the last time you had to block or restrict a cloud-based tool that employees actually wanted to use? What was it, and what was the fallout internally?"
- "What AI tools are people in your organization actively using right now -- the ones you know about and the ones you suspect they're using without approval?"
- "Talk me through the last compliance audit or regulatory review where data residency or data handling came up. What did the auditors flag?"
Past Behavior Around Solutions
- "Have you evaluated any AI solutions in the past 12 months? Walk me through that process -- who was involved, what did you look at, and where did it stall or succeed?"
- "What's the last piece of infrastructure you deployed on-premise specifically because of regulatory requirements? How long did that take from decision to running in production?"
- "Tell me about the last vendor you brought in that required a security review. How long did that process take, and what has killed deals in the past?"
- "How are you handling document review, summarization, or knowledge search today -- the tasks where people are most tempted to paste things into ChatGPT?"
Spending and Budget Signals
- "What did you spend on your last major on-prem infrastructure purchase -- both the hardware and the implementation? Who signed off on it?"
- "What's your current annual spend on data loss prevention or shadow IT monitoring tools?"
- "When you've purchased security appliances or compliance tools before, what was the typical procurement cycle -- from first conversation to purchase order?"
Urgency and Decision Process
- "If I asked your general counsel right now what keeps them up at night regarding AI and data exposure, what specifically would they say? Have they actually said it to you recently?"
- "Tell me about the last time you had to say 'no' to a business unit that wanted to adopt a new AI tool. What was their reaction, and did they find a workaround?"
- "Who in your organization is currently pushing hardest to adopt AI tools, and what arguments are they making? How are you responding?"
Red Flags to Watch For
- "We're still figuring out our AI strategy" -- means no urgency, no budget allocated. This is a tire-kicker.
- Enthusiasm without specifics -- "That sounds amazing!" but cannot name a single incident where data leakage was a real problem. They're being polite.
- No budget owner identified -- if they can't tell you who approved the last on-prem purchase, they don't have purchasing power.
- "We just blocked ChatGPT and moved on" -- the problem was solved cheaply. No residual pain.
- They haven't evaluated any AI solution at all -- not even cloud ones. Signals the org isn't AI-ready yet. You'd be selling AI adoption AND on-prem, which is two sales.
- "Our compliance team handles that" -- and you're not talking to the compliance team. Redirect or disqualify.
- All hypothetical answers -- "We would probably..." means they haven't actually dealt with it. Past behavior only.
Commitment Escalation Ladder
- Time commitment: "Can I come back in two weeks with a short demo? I'll need 30 minutes of your time and one person from your compliance team."
- Introduction commitment: "Who on your security team should I talk to about how you evaluate on-prem appliances? Can you introduce me via email this week?"
- Effort commitment: "I'd like to run a 2-week pilot in your environment. I'll need someone from IT to allocate a rack slot and network port. Can you assign someone?"
- Financial commitment: "We're offering a 90-day paid pilot at $X. This gets you the hardware, setup, and support. Can you get a PO for that?"
- Reputation commitment: "If the pilot goes well, I'd like to write a case study with your firm's name. Is that something you could approve internally?"
Where to Find These People
LinkedIn Searches:
- "CISO" AND ("law firm" OR "healthcare" OR "financial" OR "defense") AND ("on-premise" OR "data residency" OR "compliance")
- "IT Director" AND "HIPAA" AND ("hospital" OR "health system")
- "Chief Compliance Officer" AND ("fintech" OR "bank" OR "credit union")
- Filter by geography if targeting local-first: your metro area + 100 miles
Communities and Events:
- ISC2 Community forums and local chapters
- ISSA (Information Systems Security Association) local chapters -- monthly meetings, easy to attend
- BSides security conferences (city-specific, e.g., BSides SF, BSides NYC)
- Evanta CISO Executive Summits (invite-only, but you can sponsor)
- HIMSS annual conference and local chapter events (healthcare IT)
- CHIME (College of Healthcare Information Management Executives)
- Defense industry: NDIA (National Defense Industrial Association) events, CMMC Marketplace
- IAPP (International Association of Privacy Professionals) -- KnowledgeNet chapters
Other Channels:
- Podcasts: CISO Series, Defense & Aerospace Report -- guests are prospects
- Slack/Discord: CISO community on Slack, CISOConnect
- Reddit: r/cybersecurity, r/netsec (lurk and identify active practitioners)
Cold Outreach Template
Subject: How [COMPANY NAME] is handling AI + data residency
Body:
Hi [FIRST NAME],
I've been talking to [CISOs / IT directors / compliance officers] at [regulated industry] firms about how they're dealing with the AI adoption pressure while keeping data on-premise.
Two patterns keep coming up:
- Employees using ChatGPT with client data despite policies against it
- Compliance teams blocking AI tools entirely, creating friction with business units
I'm building in this space and trying to understand the problem better -- not selling anything yet. I'd value 20 minutes to hear how [COMPANY NAME] is navigating this.
Would [specific day] or [specific day] work for a quick call?
[YOUR NAME]
Why this works: References a real pattern (social proof), is honest about the stage ("not selling"), asks for a small time commitment, and provides specific scheduling options.
2. Sovereign AI Agent Platform
Target Personas
Persona A: CTO / VP Engineering at Defense Contractors
- Titles: CTO, VP Engineering, Chief Architect, Director of Software Engineering
- Where: Defense primes and Tier-2 subcontractors (L3Harris, Leidos, Booz Allen, SAIC, Palantir, Anduril, and their sub-tier suppliers), ITAR-regulated firms
- Find them: AUSA (Association of the United States Army) events, AFCEA conferences, NDIA, DoDIIS, defense tech meetups
Persona B: VP Engineering / Head of Platform at Large Healthcare or Financial Enterprises
- Titles: VP Engineering, Head of Platform Engineering, Director of AI/ML Infrastructure, Chief Architect
- Where: Health insurance companies, hospital system IT divisions, large banks (not the top 5 -- target tier 2-3 banks where they build internally), asset management firms
- Find them: QCon, KubeCon, internal platform engineering meetups, InfoQ, local CTO roundtables
Persona C: Head of AI/ML or AI Platform Lead
- Titles: Head of AI, Director of ML Engineering, VP of Applied AI, AI Platform Lead
- Where: Enterprises with 500+ engineers that have started building internal AI platforms, especially in regulated sectors
- Find them: AI Engineering Summit, MLOps Community meetups, Weights & Biases community, NeurIPS industry track
Interview Questions (15)
Current State of AI Deployment
- "Walk me through how your engineering team is using LLMs today. What's deployed in production versus what's still in experimentation?"
- "Tell me about the last AI project that got blocked or delayed because of data sovereignty or security review requirements. What happened specifically?"
- "What's your current architecture for running AI workloads? Show me the stack -- where do the models run, where does the data live, how do prompts get routed?"
- "When was the last time you evaluated OpenAI's API, Azure OpenAI, or Bedrock for a production use case? What was the decision, and what were the deal-breakers?"
- "Tell me about the last time an engineer on your team wanted to integrate an AI API into a product and couldn't because of compliance constraints. What did they do instead?"
Self-Hosting Experience
- "Have you tried self-hosting any LLMs (Llama, Mistral, etc.)? Walk me through that experience -- what worked, what was painful, and where are you now with it?"
- "What does your current GPU infrastructure look like? What are you running, who manages it, and what's your monthly spend on AI compute?"
- "Tell me about the last internal tool or platform your team built instead of buying. What drove that decision, and how did it turn out?"
- "How are you handling RAG (retrieval-augmented generation) today? What does your document ingestion pipeline look like, and what breaks?"
Agent and Tooling Needs
- "Are any teams building AI agents -- systems that take actions, not just answer questions? Tell me about the most advanced agent use case you've attempted."
- "What tools and data sources do your AI systems need to connect to? Walk me through a specific workflow where an AI agent would need to call internal APIs or databases."
- "Tell me about the last time you tried to connect an LLM to an internal system (database, API, ticketing system). What was the integration experience like?"
Spending and Priority
- "What's your current annual budget for AI infrastructure and tooling -- not research, but production AI systems? How has that changed from last year?"
- "What did you spend on your last major platform build-vs-buy decision? What did the TCO analysis look like?"
- "If your CEO asked you tomorrow for a progress report on AI deployment, what would you honestly tell them? What blockers would you list?"
Red Flags to Watch For
- "We're using OpenAI API and it's fine" -- no data sovereignty pain. They're not your customer unless their compliance team hasn't caught up yet.
- "We have a team of 20 ML engineers building this internally" -- they're building, not buying. You'd be competing with an internal team, which is extremely hard.
- No GPU infrastructure at all -- they'd need to buy hardware AND your software. Two purchases = slow.
- "We're exploring AI" -- too early. They need to be past exploration and into "we tried to deploy and hit walls."
- They can't name a specific blocked project -- the pain isn't acute. Move on.
- "Our cloud provider will solve this" -- they're betting on Azure/AWS/GCP to offer a sovereign solution. You need to understand why that bet might fail.
- Vague on budget -- "We'll find budget if it's the right solution" means there is no budget.
Commitment Escalation Ladder
- Time commitment: "I'd like to do a deeper technical session with your platform team -- 60 minutes, whiteboard session. Can you get 2-3 engineers in the room?"
- Introduction commitment: "Who's the person who actually tried to self-host Llama and hit the wall? Can I talk to them directly?"
- Effort commitment: "I'd like to deploy a proof-of-concept in your environment. I need a VM with GPU access and someone who can set up network access to one internal data source. Can you arrange that within two weeks?"
- Design partner commitment: "We're looking for 2-3 design partners who'll shape the product roadmap with us. That means weekly calls for 8 weeks and access to your environment. In exchange, you get the platform at cost for the first year. Can you commit to that?"
- Financial commitment: "Based on what we've discussed, I'm putting together pricing for a 6-month engagement. Who needs to be on the call to discuss budget?"
Where to Find These People
LinkedIn Searches:
- "CTO" AND ("defense" OR "ITAR" OR "cleared") AND ("AI" OR "LLM" OR "machine learning")
- "VP Engineering" AND ("self-hosted" OR "on-premise" OR "sovereign") AND "AI"
- "Head of AI" AND ("healthcare" OR "financial services" OR "defense")
- "Platform Engineering" AND ("LLM" OR "AI infrastructure") AND ("director" OR "head" OR "VP")
Communities and Events:
- MLOps Community (meetups + Slack) -- active practitioners
- AI Engineer Summit / AI Engineer World's Fair
- KubeCon (platform engineers who'll run this infrastructure)
- QCon (senior engineers and architects)
- AFCEA TechNet conferences (defense IT)
- DoDIIS Worldwide Conference (defense/intel)
- Local CTO/VP Eng roundtables (Pavilion, EO, YPO tech groups)
- Hacker News "Who is Hiring" threads -- companies posting GPU/AI infra roles are prospects
Other Channels:
- GitHub: Look for companies with repos related to self-hosted LLM deployment (vLLM, text-generation-inference forks)
- Job boards: Companies hiring "AI Infrastructure Engineer" or "ML Platform Engineer" are building internally and may be frustrated
- Podcasts: Latent Space, Practical AI -- guests are potential prospects
- Discord: MLOps Community, Weights & Biases, LangChain
Cold Outreach Template
Subject: Self-hosting LLMs at [COMPANY NAME] -- comparing notes
Body:
Hi [FIRST NAME],
I've been working on self-hosted AI agent infrastructure (GPU inference, RAG, tool integration -- all on-prem) and talking to engineering leaders at [defense / healthcare / financial] companies about what's actually working.
The consistent pain I'm hearing:
- Self-hosting Llama/Mistral works for demos but breaks at production scale
- Building the RAG pipeline, tool integration, and routing layer is 6+ months of platform work
- Meanwhile, the business is asking why they can't just use ChatGPT
I'm trying to validate whether this matches your experience at [COMPANY NAME]. Not selling -- just learning. 20 minutes?
[YOUR NAME]
3. AI-Powered Identity/KYC Platform
Target Personas
Persona A: Head of Compliance / Chief Compliance Officer
- Titles: Head of Compliance, CCO, Director of Compliance, BSA/AML Officer
- Where: Fintechs post-Series A (they've hit regulatory requirements), neobanks, crypto exchanges with US licenses, online lending platforms
- Find them: ACAMS (Association of Certified Anti-Money Laundering Specialists) events, FinCEN outreach sessions, Comply Advantage webinars
Persona B: Head of Risk / VP of Risk
- Titles: VP of Risk, Head of Risk, Chief Risk Officer, Director of Fraud & Risk
- Where: Payment processors, neobanks, BNPL companies, crypto on-ramps/off-ramps
- Find them: Merchant Risk Council events, LendIt Fintech, Money 20/20
Persona C: Head of Operations / VP Ops at Fintechs
- Titles: VP of Operations, Head of Ops, COO (at smaller fintechs), Director of Customer Onboarding
- Where: Fintechs doing 1,000+ KYC checks/month, marketplace lending platforms, crypto exchanges
- Find them: Fintech meetups, SaaStr (fintech track), specific fintech Slack communities
Interview Questions (14)
Current KYC Process
- "Walk me through exactly what happens from the moment a new customer submits their ID document to the moment they're approved or rejected. Every step, every handoff."
- "How many KYC reviews does your team process per day? Of those, how many require manual review by a human, and why?"
- "Tell me about the last false positive your KYC system flagged -- a legitimate customer who got blocked. What happened, and how long did it take to resolve?"
- "What's your current KYC vendor stack? Name every tool in the chain. What does each one cost you per check?"
- "When was the last time you switched KYC vendors or added a new one? What drove that decision?"
Pain Points and Costs
- "What's your current cost per KYC check -- including vendor fees, manual review labor, and the cost of false rejections? Have you actually calculated that number?"
- "Tell me about the last regulatory exam or audit where your KYC process was scrutinized. What feedback did the examiners give you?"
- "How many customer onboarding sessions are abandoned before completion? Do you track the drop-off rate at the ID verification step specifically?"
- "What's the most common document type that your current system fails on? What happens to those customers?"
- "Tell me about the last fraud case that got through your KYC process. When did you discover it, and what was the financial impact?"
Data Sensitivity and Infrastructure
- "Where does your customer PII live during the KYC process? Walk me through the data flow -- which vendors see which data, and where is it stored?"
- "Have you ever had a regulator or auditor raise concerns about sending customer identity documents to a third-party API? What was that conversation like?"
- "What's your current sanctions screening process? How often does your watchlist update, and tell me about the last time a sanctions hit caused a problem."
Budget and Decision Making
- "What's your total annual spend on KYC and identity verification -- vendors, headcount, infrastructure, everything? Who owns that budget?"
Red Flags to Watch For
- "We use Jumio/Onfido and it works great" -- satisfied with incumbent, no pain. Unless you dig deeper and find hidden costs.
- "We do fewer than 100 KYC checks a month" -- too small to justify a platform switch. Volume is required for ROI.
- "Our regulator hasn't raised AI/KYC concerns" -- regulatory push isn't there yet for them.
- They can't quantify their cost per check -- they haven't felt the pain enough to measure it.
- "We're pre-revenue / pre-launch" -- they don't have real KYC volume yet. Come back later.
- "We just signed a 3-year contract with [vendor]" -- locked in. Time the next conversation for 18 months out.
- No fraud losses to point to -- either their process works fine or they don't know about the fraud. Both are bad for you.
Commitment Escalation Ladder
- Time commitment: "Can I shadow your compliance analyst for a morning to watch the manual review process? I'll sign an NDA."
- Data commitment: "Can you send me 50 anonymized examples of documents your current system fails on? I want to test my approach against your actual failure cases."
- Introduction commitment: "I'd like to talk to whoever manages your Jumio/Onfido relationship. Can you introduce me?"
- Pilot commitment: "I'd like to run my system in parallel with your current vendor for 2 weeks -- same documents, side-by-side comparison. I need API access to your document ingestion pipeline. Can you set that up?"
- Financial commitment: "Based on the parallel run, here's the accuracy comparison and cost savings. I'm proposing a 6-month contract at $X. Can we get your Head of Risk and CFO on a call?"
Where to Find These People
LinkedIn Searches:
- "Head of Compliance" AND ("fintech" OR "neobank" OR "crypto" OR "lending")
- "BSA Officer" OR "AML Officer" AND ("fintech" OR "digital bank")
- "Head of Risk" AND ("payments" OR "lending" OR "crypto")
- "VP Operations" AND "fintech" AND ("KYC" OR "onboarding" OR "verification")
Communities and Events:
- ACAMS conferences and local chapters (the compliance professional's home base)
- Money 20/20 (Las Vegas and Amsterdam) -- the fintech mega-conference
- LendIt Fintech (lending-specific)
- Merchant Risk Council (MRC) -- fraud and risk professionals
- Fintech Meetup (emerging conference, high-quality networking)
- Comply Advantage community events
- FinCrime World Forum
Other Channels:
- Fintech-specific Slack groups: Fintech Collective, On Deck Fintech
- Reddit: r/fintech, r/compliance
- Substack/newsletters: Fintech Takes, Fintech Business Weekly -- comment sections have practitioners
- Crunchbase: Filter for fintechs that raised Series A-C in the last 18 months (they have KYC needs and budget)
Cold Outreach Template
Subject: Your KYC manual review rate -- curious how it compares
Body:
Hi [FIRST NAME],
I've been researching KYC operations at fintechs and one number keeps surprising me: the manual review rate. Most teams I've talked to are manually reviewing 15-30% of checks, even with Jumio/Onfido in place.
I'm exploring whether AI can meaningfully reduce that -- especially for the document types that current vendors struggle with (non-US IDs, poor lighting, etc.) -- while keeping PII processing on your infrastructure.
I'm early-stage and learning, not pitching. I'd love 20 minutes to hear how [COMPANY NAME]'s KYC process actually works in practice.
[YOUR NAME]
4. Compliance AI Agent for Fintech
Target Personas
Persona A: Head of Compliance / BSA Officer
- Titles: Head of Compliance, BSA/AML Officer, Chief Compliance Officer, Director of Regulatory Affairs
- Where: Series B+ fintechs, payment processors, money transmitters, digital banks, BNPL companies
- Find them: ACAMS events, FinCEN Innovation Hours, state regulator roundtables
Persona B: General Counsel / Deputy GC
- Titles: General Counsel, Deputy General Counsel, VP Legal, Head of Legal & Compliance
- Where: Fintechs with 50-500 employees (big enough to carry a compliance burden, small enough to feel it), crypto companies with state licenses
- Find them: ACC (Association of Corporate Counsel) fintech working groups, Legaltech events, in-house counsel meetups
Persona C: VP of Finance / CFO
- Titles: CFO, VP Finance, Controller, Head of Finance
- Where: Fintechs paying significant compliance costs -- multiple state licenses, SOC 2, money transmitter bonds
- Find them: CFO Alliance, fintech finance meetups -- they're often the budget holders when compliance wants tools
Interview Questions (15)
Current Compliance Operations
- "Walk me through what happens in your compliance department on a typical Monday morning. What reports are being generated, what alerts are being reviewed, what filings are being prepared?"
- "How many SARs (Suspicious Activity Reports) did your team file last quarter? Walk me through the process of creating one, from the initial alert to the final filing."
- "Tell me about the last regulatory exam you went through. What did the examiners spend the most time on? What findings did they have?"
- "How many people on your compliance team spend their day on transaction monitoring? What tools are they using, and what does the alert review process look like?"
- "Tell me about the last time a compliance rule or threshold changed -- for example, a new state requirement or updated FinCEN guidance. How long did it take to update your systems?"
Pain Points and Specific Incidents
- "Tell me about the last compliance deadline you almost missed or did miss. What went wrong?"
- "What's the most tedious, repetitive task your compliance team does? How many hours per week does it consume?"
- "When was the last time you got a regulatory finding or a consent order? What was the root cause?"
- "Tell me about the last time your transaction monitoring system generated a spike in false positives. How did your team handle the backlog?"
- "What's your current false positive rate on transaction monitoring alerts? How many alerts does each analyst review per day?"
Spending and Staffing
- "How many compliance staff have you hired in the last 12 months? What's your ratio of compliance headcount to revenue? How does your board feel about that?"
- "What's your current spend on compliance technology -- transaction monitoring, case management, reporting tools? List every vendor."
- "Tell me about the last compliance tool you purchased. What was the buying process like -- who was involved, how long did it take, and what was the budget?"
Data Sensitivity
- "Where does your transaction data live? When your compliance tools process it, does it leave your infrastructure? Has a regulator or auditor ever asked about that?"
- "Have you ever rejected a compliance vendor because of data handling concerns? What specifically was the issue?"
Red Flags to Watch For
- "We use Chainalysis/Elliptic/Sardine and it covers everything" -- locked into an incumbent, no gap.
- "We have two compliance people and it's fine" -- not enough scale for automation ROI. They probably need 10+ compliance staff or 50K+ transactions/month.
- "We haven't had a regulatory exam yet" -- too early. They don't know what pain feels like.
- Compliance is outsourced to a law firm -- they're not doing it in-house, so they don't need internal tooling.
- "Our engineering team built our compliance monitoring" -- internal solution exists. Hard to displace.
- They can't tell you their SAR filing count -- either too small or compliance isn't a priority.
- No specific regulatory findings -- if they've never been examined and found wanting, urgency is low.
Commitment Escalation Ladder
- Time commitment: "Can I sit with your compliance analyst for half a day and watch them process alerts? I'll sign whatever NDA you need."
- Introduction commitment: "Who manages your relationship with [current compliance vendor]? I'd like to understand what's working and what's not from their perspective."
- Effort commitment: "Can you export a month of anonymized transaction monitoring alerts? I want to show you what an AI agent would do with them versus what your analysts did."
- Pilot commitment: "I want to run my system alongside your current process for 30 days -- same alerts, AI-generated draft SARs alongside your analysts' work. Can you assign one analyst to review the AI outputs?"
- Financial commitment: "Here are the results from the parallel run: X hours saved, Y improvement in detection. I'm proposing a $Z/month engagement. Can we schedule a call with your CFO?"
Where to Find These People
LinkedIn Searches:
- "Head of Compliance" AND ("fintech" OR "payments" OR "neobank" OR "lending" OR "crypto")
- "BSA Officer" AND ("fintech" OR "digital bank" OR "money transmitter")
- "General Counsel" AND "fintech" AND ("compliance" OR "regulatory")
- "Chief Compliance Officer" AND ("series B" OR "series C" OR "growth stage") AND "fintech"
Communities and Events:
- ACAMS (Association of Certified Anti-Money Laundering Specialists) -- the must-attend for AML compliance
- FinCEN Innovation Hours -- direct access to compliance-focused fintech leaders
- Fintech Compliance Summit
- Money 20/20 compliance track
- Comply Advantage community
- RegTech-specific events: RegTech Data Summit, Regulation Innovation Conference
- State regulator conferences (CSBS Annual Conference, NMLS conference)
Other Channels:
- ACAMS Today (newsletter/magazine) -- authors and quoted experts are prospects
- Compliance-focused LinkedIn groups
- Fintech compliance consultants -- they know who has the worst pain. Ask them for intros.
- Job postings: Fintechs hiring multiple compliance analysts simultaneously have scaling pain
Cold Outreach Template
Subject: Transaction monitoring alert volume at [COMPANY NAME]
Body:
Hi [FIRST NAME],
I've been talking to compliance teams at fintechs about their transaction monitoring operations. One pattern keeps repeating: teams are drowning in false positives and spending 60-70% of analyst time on alerts that turn out to be nothing.
I'm building AI that can draft SAR narratives, triage alerts, and generate regulatory reports -- running on your infrastructure so transaction data stays in-house.
I'm in research mode and trying to understand how [COMPANY NAME]'s compliance operations actually work day to day. Not selling -- genuinely learning.
Would you have 20 minutes this week or next?
[YOUR NAME]
5. Privacy-First AI Copilot for Developers
Target Personas
Persona A: Engineering Lead / Director at Defense Contractors
- Titles: Director of Software Engineering, Engineering Manager, Lead Software Architect, VP Engineering
- Where: Defense primes and subs (Lockheed, Raytheon, L3Harris, Northrop -- and their hundreds of subcontractors), cleared facilities, ITAR-regulated firms
- Find them: AFCEA events, defense tech meetups, Secret/TS-cleared job boards (postings there indicate cleared dev work)
Persona B: VP Engineering / Head of Developer Experience at Banks
- Titles: VP Engineering, Head of Developer Productivity, Director of Developer Experience, Engineering Platform Lead
- Where: Banks with 200+ developers (not the top 5 -- they build their own), large insurance companies, financial data providers (Bloomberg, FactSet, S&P-type firms)
- Find them: DeveloperWeek, internal developer productivity conferences, DX community
Persona C: IT Director / ISSM at Government Agencies
- Titles: ISSM (Information System Security Manager), IT Director, Branch Chief for Software, Development Lead
- Where: Federal agencies doing custom software development, intelligence community, DoD software factories (Kessel Run, Platform One, Kobayashi Maru)
- Find them: ATARC, ACT-IAC events, government open source community meetups, government DevSecOps events
Interview Questions (15)
Current Developer Tooling
- "What code completion or AI coding tools are your developers using today? Not what's approved -- what are they actually using?"
- "Tell me about the last time a developer was caught using GitHub Copilot or ChatGPT to write code in a restricted environment. What happened?"
- "Walk me through your current developer toolchain for a typical project. IDE, version control, CI/CD, code review -- everything. Where are the gaps?"
- "How long does it take a new developer to become productive on your codebase? What does the onboarding process look like?"
Past Behavior Around AI Coding Tools
- "Have you evaluated GitHub Copilot, Cursor, or similar tools in the past year? Walk me through that evaluation. Where did it stall?"
- "What was the security team's response when Copilot was first requested? What specific objections did they raise?"
- "Tell me about the last time you tried to get a development tool approved through your security review process. How long did it take, and what was the outcome?"
- "How are your developers handling code search and understanding legacy codebases today? What do they do when they need to understand a module they've never touched?"
Productivity and Pain
- "What's your biggest developer productivity bottleneck right now? Not what you think it is -- what does your last developer survey or retro actually say?"
- "How much time per week do your developers spend writing boilerplate code, documentation, or tests -- the work they'd automate if they could?"
- "Tell me about the last project that was delayed because of a developer productivity issue. What was the root cause?"
- "What's your developer attrition rate? In exit interviews, what do developers cite as frustrations with the tooling environment?"
Budget and Infrastructure
-
"What's your current per-developer spend on IDE licenses, tooling, and developer productivity software? What's the total annual budget?"
-
"Do you have any GPU infrastructure on-premise today -- for any purpose? What's there, and who manages it?"
-
"Who decides what tools developers can use? Walk me through the approval chain from a developer requesting a tool to it being available."
Red Flags to Watch For¶
- "We don't allow any AI coding tools and nobody's asking" -- no demand signal. Either the developers don't care (unlikely) or they're not pushing for it.
- "We tried Copilot and the developers didn't like it" -- they may not want any code assistant, regardless of where it runs.
- "Our code isn't complex enough to need AI help" -- small codebase, simple work. The ROI isn't there.
- "We're a Python shop doing CRUD apps" -- the productivity gain from AI coding tools is smaller for simple codebases.
- No GPU infrastructure and no appetite to buy it -- the barrier to entry is too high if they need to buy hardware first.
- "We'll wait for our cloud provider to offer this" -- they're expecting Azure/AWS to solve this. May be right.
- Developers who say "it's fine, I type fast" -- they haven't experienced the productivity difference. Hard to sell what people don't know they're missing.
Commitment Escalation Ladder¶
- Time commitment: "Can I demo this to your development team? I'll need 45 minutes and a conference room. Can you get 5-6 developers in the room?"
- Effort commitment: "I'd like to set up a box in your environment and let 3 developers use it for a week. I need a network port, power, and their IDE configs. Can your IT team handle the physical setup?"
- Measurement commitment: "Can your developers log their usage for two weeks -- tasks completed with the tool vs. without? I'll provide the tracking spreadsheet."
- Pilot commitment: "I want to run a 30-day pilot with 10 developers. I'll provide the hardware and software. I need your IT team to image the box and connect it to your code repositories. Can you commit to that?"
- Financial commitment: "The pilot showed X% productivity improvement across 10 developers. At your fully loaded developer cost, that's $Y saved per year. I'm proposing an annual license at $Z. Who needs to approve this?"
Where to Find These People¶
LinkedIn Searches:
- "Director of Engineering" AND ("defense" OR "cleared" OR "ITAR" OR "government")
- "VP Engineering" AND ("bank" OR "financial services" OR "insurance") AND "developer productivity"
- "Engineering Manager" AND ("DoD" OR "defense contractor" OR "government") AND ("software" OR "development")
- ("ISSM" OR "Information System Security Manager") AND ("software development" OR "DevSecOps")
Communities and Events:
- AFCEA TechNet events (defense IT + development)
- DoD software factory community events (Platform One, Kessel Run community)
- DevSecOps Days (government-focused DevOps)
- DeveloperWeek / DevRelCon (developer productivity track)
- ACT-IAC (American Council for Technology / Industry Advisory Council)
- ATARC (Advanced Technology Academic Research Center)
- Defense Entrepreneurs Forum (DEF)
Other Channels:
- Job postings: any defense contractor hiring developers and mentioning "air-gapped" or "classified" environments
- Government DevSecOps Slack communities
- r/sysadmin and r/cybersecurity -- threads about air-gapped development
- Conference speakers at defense software events -- they're senior enough to buy and willing to talk publicly
Cold Outreach Template¶
Subject: AI code completion without code leaving your network
Body:
Hi [FIRST NAME],
I keep hearing the same frustration from engineering leaders at [defense / financial / government] organizations: your developers want GitHub Copilot, your security team says no, and you're stuck in the middle.
I'm building a self-hosted code assistant that runs entirely on local GPU hardware -- code never touches the internet. Think Copilot, but deployed like an on-prem appliance.
I'm talking to engineering leaders who've gone through the Copilot evaluation and gotten blocked by security. If that's happened at [COMPANY NAME], I'd love to hear the story.
20 minutes -- no pitch, just learning. Would [day] or [day] work?
[YOUR NAME]
6. Meta-Discovery: Should I Productize Nanobot or Build New?¶
This section addresses the strategic question of whether to package your existing AI agent (nanobot) as a product versus building something purpose-built for a specific market.
Who Would Pay for a Self-Hosted AI Agent Platform?¶
High-likelihood buyers (based on pattern matching from the opportunities above):
- Defense contractors with ITAR/CMMC requirements who are trying to build AI capabilities
- Healthcare systems with PHI constraints who want AI but can't use cloud APIs
- Financial firms with data residency requirements (especially non-US firms subject to GDPR, etc.)
- Government agencies and system integrators building on classified networks
The real question is: are they paying for the platform, or for the solution the platform enables?
You need to discover whether buyers want:
- (A) A platform they configure themselves (developer/platform team buyer)
- (B) A turnkey solution for a specific use case (business buyer)
These are fundamentally different products, buyers, and sales motions.
Interview Questions for Enterprise CTOs / IT Leaders (15)¶
Current AI Deployment Reality
- "What AI capabilities are actually in production in your organization today -- not pilots, not experiments, but serving real users or customers?"
- "Tell me about the last AI project that went from prototype to production. How long did that take, and what were the biggest bottlenecks?"
- "Walk me through your current AI/ML infrastructure. Where do models run, who maintains them, and what's the ops burden?"
- "How many different LLMs or AI models are you running in production? How do you manage model updates and version changes?"

Build vs. Buy Signals
- "Tell me about the last major platform decision your team made -- build internally or buy a vendor solution. What drove that decision?"
- "What's your team's experience with self-hosted open-source platforms? Give me a specific example -- what did you deploy, and what was the ongoing maintenance burden?"
- "When you've bought platform software before (Kubernetes, observability, CI/CD), what made you choose a commercial product over the open-source version?"
- "How many engineers on your team are currently dedicated to AI infrastructure and tooling versus building AI-powered features? How do you feel about that ratio?"

Agent-Specific Discovery
- "Are any teams in your organization building AI agents -- systems that take autonomous actions, not just answer questions? Tell me about the most advanced one."
- "What internal systems do your AI tools need to integrate with? Walk me through a specific workflow that crosses multiple systems."
- "Tell me about the last time you tried to connect an LLM to an internal tool or API. What was the integration experience, and what broke?"
- "Have you looked at or used any agent frameworks -- LangChain, CrewAI, AutoGen, or similar? What was that experience like?"

Nanobot-Specific Validation
- "If I showed you an AI agent platform that included MCP tool integration, multi-LLM routing, and RAG -- all self-hosted -- what use case would you point it at first? [IMPORTANT: Only ask this AFTER they've described their actual problems. This is a validation question, not a leading question.]"
- "What's the thing your team has spent the most time building in your AI stack that you wish you could have just bought off the shelf?"
- "Tell me about the last open-source AI tool your team adopted. What was the adoption experience like, and what made you keep using it or abandon it?"
Decision Framework: Productize Nanobot vs. Build New¶
After conducting 15-20 interviews across the five opportunity areas, use this framework:
Productize Nanobot if:
- Multiple interviewees describe building the exact same infrastructure nanobot already provides
- They specifically mention agent orchestration, MCP tool integration, or multi-LLM routing as things they've built or are building
- They express frustration with maintaining internal AI platform code
- The buyer is a platform engineering team (technical buyer who wants components)
- You keep hearing "we built something like that but it's held together with duct tape"
Build something new (vertical solution) if:
- Interviewees care about the outcome (KYC checks, compliance reports, code completion), not the platform
- They don't know or care what MCP, RAG, or LLM routing means -- they just want the problem solved
- The buyer is a business function (compliance, risk, operations), not a technical team
- The pain is in a specific domain, not in "building AI infrastructure"
- You hear "I don't care how it works, I just need it to [specific task]"
Hybrid approach signals:
- Some buyers want the platform, others want turnkey -- this means two products, two sales motions, two pricing models
- If you see this, pick one and commit for 6 months before trying the other
Red Flags That You're Fooling Yourself¶
- You're only talking to technical people who say "cool project" -- that's not a customer, that's a peer
- Nobody will name a dollar amount they're currently spending on the problem
- People are interested in nanobot as open-source but nobody asks about pricing -- they want free, not a product
- You're getting meetings but no one follows up or introduces you to others -- polite but not in pain
- Every conversation requires you to explain what an AI agent is -- the market isn't ready
- You keep hearing "we'll build that ourselves" -- your target customer is a builder, not a buyer
Commitment Tests Specific to Nanobot Productization¶
Run these tests before writing a single line of product code:
- Landing page test: Put up a page describing the self-hosted AI agent platform. Track how many people leave their work email (not Gmail).
- Design partner test: Ask 3 companies to commit to weekly calls for 8 weeks to shape the product. If you can't get 3, the demand isn't there.
- LOI test: Ask 2 companies to sign a non-binding letter of intent to purchase at a stated price point. If they won't sign a non-binding letter, they won't sign a binding one.
- Pilot payment test: Offer a paid pilot ($5K-$15K for 60 days). If nobody will pay for a pilot, nobody will pay for a product.
- Deployment test: Actually deploy nanobot in one company's environment. Track what breaks, what's missing, and how much hand-holding is required. If you're on-site for a week to make it work, it's consulting, not a product.
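The landing page test above hinges on separating work emails from personal ones. A minimal sketch of that filter in Python -- the `FREE_PROVIDERS` set is illustrative, not exhaustive, and the function names are mine:

```python
# Count landing-page signups from work domains vs. free personal-email
# providers. Only work emails count as a demand signal.
FREE_PROVIDERS = {
    "gmail.com", "yahoo.com", "outlook.com", "hotmail.com",
    "icloud.com", "proton.me", "aol.com",
}

def is_work_email(address: str) -> bool:
    """Treat any valid address not on a known free provider as a work address."""
    domain = address.rsplit("@", 1)[-1].strip().lower()
    return "@" in address and domain not in FREE_PROVIDERS

def signup_signal(addresses: list[str]) -> dict[str, int]:
    """Split a list of signup emails into work vs. personal counts."""
    work = sum(1 for a in addresses if is_work_email(a))
    return {"work": work, "personal": len(addresses) - work}
```

In practice you'd also want to fold in catch-all consumer domains you see in your own signup data; the point is simply that the metric is work-domain signups, not raw signups.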
Appendix A: Interview Logistics Checklist¶
Before Every Interview:
- [ ] Research the person's background (LinkedIn, company news, recent regulatory actions)
- [ ] Prepare 3 custom questions specific to their company/role
- [ ] Decide on your commitment ask for the end of the meeting
- [ ] Set up recording (with permission) or have a note-taking system ready
- [ ] Have your "I'm not selling" framing rehearsed -- mean it
During Every Interview:
- [ ] Shut up and listen. Talk no more than 30% of the time.
- [ ] When they say something interesting, ask "tell me more about that"
- [ ] When they give a vague answer, ask "can you give me a specific example?"
- [ ] When they say something is painful, ask "what did that cost you?"
- [ ] Never say "would you use..." or "what if we built..."
- [ ] Write down exact quotes, not your interpretation
After Every Interview:
- [ ] Write up notes within 1 hour while it's fresh
- [ ] Score the conversation: (1) Did they describe a real problem? (2) Are they currently spending money on it? (3) Did they make a commitment?
- [ ] Update your pattern tracker: what themes are repeating across interviews?
- [ ] Send a thank-you email within 24 hours that references something specific they said
- [ ] Follow up on any commitment they made within 48 hours
Appendix B: Interview Scoring Matrix¶
Rate each interview 1-5 on these dimensions:
| Dimension | 1 (Cold) | 3 (Warm) | 5 (Hot) |
|---|---|---|---|
| Problem | Theoretical / no specific incident | Has experienced it but worked around it | Active, ongoing, costly pain |
| Current spend | $0 on the problem | Some budget allocated, using free/cheap tools | Significant spend on partial solutions |
| Urgency | "Someday we'll need this" | "It's on our roadmap this year" | "We needed this yesterday" |
| Authority | Can't influence a purchase decision | Can recommend and champion internally | Can sign a check or directly approve |
| Commitment | Took the meeting but no follow-up | Agreed to introduce someone / attend a demo | Committed time, effort, or money |
Scoring guide:
- 20-25 total: Strong prospect. Prioritize.
- 15-19: Potential. Needs one more conversation to confirm.
- 10-14: Lukewarm. Keep in touch but don't invest heavily.
- Below 10: Not a customer right now. Move on.
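If you're tracking interviews in a spreadsheet export, the scoring guide is simple enough to automate. A minimal sketch, assuming each interview's ratings arrive as a dict keyed by dimension (the key names are mine; the 1-5 scale and thresholds come from the matrix and guide above):

```python
# Score one interview against the Appendix B rubric: sum five 1-5
# ratings, then map the total to a prospect bucket per the scoring guide.
DIMENSIONS = ("problem", "current_spend", "urgency", "authority", "commitment")

def score_interview(ratings: dict[str, int]) -> tuple[int, str]:
    """Return (total, bucket) for one interview's dimension ratings."""
    if set(ratings) != set(DIMENSIONS):
        raise ValueError(f"expected ratings for exactly: {DIMENSIONS}")
    if any(not 1 <= r <= 5 for r in ratings.values()):
        raise ValueError("each dimension must be rated 1-5")
    total = sum(ratings.values())
    if total >= 20:
        bucket = "Strong prospect. Prioritize."
    elif total >= 15:
        bucket = "Potential. Needs one more conversation to confirm."
    elif total >= 10:
        bucket = "Lukewarm. Keep in touch but don't invest heavily."
    else:
        bucket = "Not a customer right now. Move on."
    return total, bucket
```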
Appendix C: Cross-Opportunity Pattern Tracking¶
After 5+ interviews per opportunity, fill in this grid to decide where to focus:
| Signal | Opp 1: Appliance | Opp 2: Sovereign Platform | Opp 3: KYC | Opp 4: Compliance Agent | Opp 5: Dev Copilot |
|---|---|---|---|---|---|
| # of interviews where real $ pain cited | | | | | |
| # who named current spend > $50K/yr | | | | | |
| # who made a commitment (intro, time, pilot) | | | | | |
| Average deal size estimate | | | | | |
| Sales cycle length estimate | | | | | |
| Can I build an MVP in 4 weeks? | | | | | |
| Do I have unfair insight/access? | | | | | |
Decision rule: Pick the opportunity where you have the highest count of real-dollar pain AND the shortest path to a paid pilot. Not the one that's most technically interesting.
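Once the grid is filled in, the decision rule can be applied mechanically. A sketch under the assumption that each opportunity's row is reduced to two numbers -- the real-dollar-pain interview count and an estimated weeks-to-paid-pilot (both field names and the sample figures are hypothetical):

```python
# Rank opportunities per the decision rule: most real-dollar pain first,
# with the shortest path to a paid pilot breaking ties.
def rank_opportunities(grid: dict[str, dict[str, int]]) -> list[str]:
    """Return opportunity names, best candidate first."""
    return sorted(
        grid,
        key=lambda opp: (
            -grid[opp]["dollar_pain_interviews"],  # more pain ranks higher
            grid[opp]["weeks_to_paid_pilot"],      # faster pilot breaks ties
        ),
    )
```

Note the deliberate absence of a "technically interesting" column: the rule says to ignore that entirely.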