
Banff, Alberta hosted Canada's first AI and Culture Summit on March 18, bringing together artists, musicians, writers, technologists, and policymakers to debate artificial intelligence's impact on creative industries, CBC reported. Cultural professionals at the event demanded stronger copyright protections against AI companies that train models on copyrighted works without permission or compensation.
The conference addressed tensions between AI's potential as a creative tool and its threat to cultural workers' livelihoods, as generative systems produce music, visual art, writing, and video at costs that undermine human creators' ability to earn sustainable incomes from cultural production. Summit discussions centered on whether regulation can balance the benefits of innovation against protecting artists whose work trains the very AI systems that could replace them.
Creative Industries Face Existential AI Challenges
Artists attending the summit emphasized that AI companies built billion-dollar businesses by training models on copyrighted creative works scraped from the internet without seeking permission or offering compensation. Musicians pointed to AI systems generating songs mimicking specific artists' styles after training on their complete discographies, visual artists showed AI-generated images replicating distinctive techniques learned from their portfolios, and writers demonstrated chatbots producing content in their narrative voices.
These capabilities threaten creative professionals' economic viability as clients increasingly choose AI-generated content costing pennies over hiring human creators who charge professional rates. The speed and cost advantages of AI-generated work create market dynamics in which human artists struggle to compete on price, even as AI systems improve their quality by training on the superior human work they are displacing.
Cultural organizations also raised concerns about AI homogenizing creative output as models trained on popular works reproduce dominant aesthetic patterns rather than supporting diverse artistic voices and experimental approaches that don't fit algorithmic patterns learned from training data. This dynamic risks narrowing cultural production toward commercially safe outputs that AI can generate reliably while marginalizing distinctive creative visions.
Demands for Copyright Protection and Compensation
Summit participants called for legislation requiring AI companies to license copyrighted works used in training data, similar to how music streaming services pay royalties to artists whose songs they distribute. Proposals included opt-in systems in which creators explicitly authorize AI training on their work, rather than current opt-out approaches that place the burden on artists to identify and block their content from training datasets.
Artists also advocated for transparency requirements forcing AI companies to disclose what copyrighted materials appear in training data, letting creators identify unauthorized use and seek compensation. Current practices keep training datasets confidential, preventing artists from knowing whether their work was used or proving infringement when AI outputs demonstrate clear stylistic influence from specific creators.
Some participants proposed AI-generated content labeling requirements ensuring audiences can distinguish human-created cultural works from AI outputs, preserving market differentiation and preventing AI systems from passing off derivative work as original human creativity. This parallels existing requirements for disclosing when images are digitally manipulated or when sponsored content appears in editorial contexts.
Technology Industry Resistance to Creative Protections
Technology representatives at the summit argued that requiring licensing for training data would make AI development economically unviable and would set a legal precedent that could restrict other transformative uses of copyrighted material, including search engines, academic research, and accessibility tools. They emphasized that AI training transforms copyrighted works rather than copying them, which they argue falls under fair use principles.
Industry participants also claimed that AI tools democratize creative production by letting people without traditional artistic training express ideas visually, musically, or narratively. This framing positions AI as expanding creative participation rather than threatening professional artists, though creative professionals counter that displacing sustainable creative careers ultimately reduces the quality and diversity of cultural production regardless of expanded amateur access.
Policy Implications for Canadian Cultural Industries
The summit reflects Canadian government recognition that AI poses unique challenges for cultural industries the country has historically protected through content quotas, funding programs, and copyright frameworks supporting domestic creators. Policymakers face pressure balancing technology sector growth against cultural sector sustainability as AI disruption intensifies.