What to ask before hiring an AI-assisted creator

Team TBM
Apr 20, 2026 · 8 min read

Only 31% of creators say they always disclose AI use to clients. That’s from Envato’s “Beyond Adoption: State of AI in Creative Work 2026,” a survey of 1,780 working creatives published in November 2025. Among agency and studio professionals specifically, the number drops further — 58% have used AI in client deliverables without disclosing it.

This isn’t a crisis. It’s a gap — between how fast AI tools have entered creative work and how slowly the disclosure norms have followed. If you’re hiring AI-assisted creative talent, the questions to ask have changed. Most vetting frameworks haven’t caught up.

Here’s what to ask — and why.

Quick answer

Before hiring an AI-assisted creator, ask eight questions across three areas:

  • Creative accountability (3 questions): Does this creator direct the AI, or does the AI direct the work?
  • Disclosure practices (3 questions): Do they have a consistent standard for revealing AI use?
  • IP and legal (2 questions): Do they understand copyright, and are their tools licensed for commercial work?

The full questions and how to read the answers are below.

Why the old questions don’t work anymore

The classic creative interview covered taste, process, and portfolio. “What’s your process?” used to tell you a lot. Today it often tells you less.

“My process” can mean anything from a creator who spends three days in conceptual development before touching a tool, to one who pastes your brief into a generator, lightly edits the output, and ships it. Both might describe their work as “research, ideation, and refinement.” Both might show you a polished portfolio.

The difference isn’t in the output. It’s in the human contribution — the deliberate choices, the judgment calls, the decisions the AI didn’t make. That’s what you’re paying for. The traditional questions weren’t designed to surface it.

The questions below are. They’re not gotchas. They give good creators an opportunity to explain what makes their work theirs — and give you the information to make a confident hire.

The questions, in three groups

Creative accountability: testing the human’s role

“Walk me through a creative decision you made that the AI didn’t suggest.”

This is the most revealing question you can ask when hiring AI-assisted creative talent. You’re not testing whether they used AI — you’re testing whether they directed it.

A strong answer is specific: a structural choice they made, a direction they rejected, a constraint they introduced. A weak answer stays vague — “I always put my own spin on it” — or describes the AI’s output rather than their own contribution. If a creator can’t name a single decision they made that wasn’t AI-prompted, that’s the answer.

“How do you quality-check AI output before it reaches me?”

This is about craft discipline. You want to know what the creative layer looks like between generation and delivery.

A strong answer describes a repeatable process: what they check for, where they rewrite, what they test against. A weak answer is a value statement — “I always make sure it’s good” — with nothing behind it. The creator who can walk through their QA steps is genuinely in control of the work.

“Have you ever pushed back on a client brief because you knew AI couldn’t serve it well?”

A creator who uses AI as a tool — not a default — has a view on when it’s the wrong tool.

A strong answer includes a real example: a project that needed lived experience, nuanced tone, or proprietary context where they had that honest conversation. A weak answer hedges — “AI can do almost anything well these days.” That might be arguable. It’s not an honest answer about judgment.

Transparency and disclosure: testing standards awareness

“What does your AI disclosure look like in a deliverable — and how does it change based on what the client needs?”

You’re checking two things: whether the creator has a disclosure practice at all, and whether they think about context. Disclosure appropriate for a mood board may not be appropriate for content representing your brand.

A strong answer distinguishes between types of deliverables and types of use. The IAB AI Transparency and Disclosure Framework, published January 2026, sets voluntary industry guidance that disclosure should reflect when AI “materially affects authenticity, identity, or representation.” A creator who’s thought about this will recognize what that means in practice. A weak answer gives you a blanket statement — “I disclose AI use upfront” — without any nuance about how that changes when the stakes change.

“If I asked to see the original brief and the first AI draft alongside the final, what would I see?”

You probably don’t need the documentation. You’re gauging willingness. A creator comfortable with transparency will walk you through the gap — here’s the raw output, here’s what changed, here’s why. That’s a creator who owns the work.

A creator who gets defensive at the question itself is telling you something before you’ve even seen the draft.

“If a client later had concerns about AI involvement, how would you handle that conversation?”

This is the accountability question. Given what the Envato data shows about disclosure rates, you want to know: does this creator take responsibility, or deflect?

A strong answer shows they’ve thought about it: a clear position on disclosure, a willingness to have the conversation. A weak answer minimizes the scenario or treats client concern as a misunderstanding to correct. Accountability shows up here before it ever needs to show up on a project.

IP and legal exposure: testing downstream risk

Note: The guidance below is editorial, not legal advice. For specific questions about copyright ownership, licensing, or contractual rights, consult a qualified IP attorney.

“Who owns the copyright on the work you deliver to me?”

This question has a more complicated answer than it used to.

The US Copyright Office’s January 2025 guidance established that copyright protection requires traceable human authorship. AI-generated elements — where no human made the creative choices — don’t qualify for copyright protection on their own. That doesn’t automatically mean you own nothing, but the more AI-generated the work, the thinner the claim. A creator who can explain this, and identify which elements reflect traceable human decisions, understands the terrain.

A weak answer is a blanket reassurance: “You own everything once I hand it over.” That’s a contract clause, not an IP analysis.

“What AI tools do you use, and do you have usage rights to deploy them commercially?”

Not all AI tools are licensed for commercial client work. Some consumer-tier subscriptions prohibit commercial use of outputs.

A strong answer names the tools and confirms they’re on commercial plans or have reviewed the terms. A weak answer is vague about which tools, or waves off the question as a technicality. If the creator is delivering work built on tools they’re not licensed to deploy commercially, that exposure follows the work — not just the creator.

The 2026 context: disclosure is becoming a standard

Two frameworks are worth knowing about — not because you need to become a compliance expert, but because they signal where professional expectations are heading.

The IAB AI Transparency and Disclosure Framework (January 2026) sets voluntary industry guidance for AI disclosure in advertising and branded content. It’s not law. But it reflects where professional norms are moving.

The EU AI Act Article 50, enforceable from August 2, 2026, requires AI-generated content to be marked in machine-readable format for EU-market operations. If your campaigns or branded materials reach European audiences, this applies. For US-only engagements, it doesn’t directly — but the trend it represents does.

Disclosure is moving from something thoughtful creators do toward a professional expectation. Clients who don’t ask are absorbing avoidable risk.

Your vetting checklist

  • Names a specific creative decision the AI didn’t make
  • Describes a repeatable QA process (not just “I make sure it’s good”)
  • Has pushed back on at least one brief the AI couldn’t serve well
  • Disclosure practice accounts for different deliverable types
  • Willing to show raw AI output alongside final work
  • Takes clear accountability for client concerns about AI use
  • Can explain copyright ownership in terms of human authorship
  • Confirms commercial licenses for all AI tools used

A creator who walks through all eight with specifics understands their work — and respects yours.

How TBM approaches this

At The Blue Mango, we work with creators on both sides of this conversation. We know what good AI-assisted practice looks like, and we’ve built transparency standards into how we work — so you’re not discovering the rules mid-project.

When you hire through TBM, disclosure practices are part of how every engagement is structured. The questions above are ones we’ve already asked.

If you want help thinking through your next creative hire or project — AI-assisted or traditional — talk to us before it starts. We’ll help you scope it, vet it, and make sure everyone’s on the same page from day one.