Unveiling AI's Impact and Challenges at SecTor 2024 | A SecTor Cybersecurity Conference Toronto 2024 Conversation with Helen Oakley and Larry Pesce | On Location Coverage with Sean Martin and Marco Ciappelli


Episode 503 | Published 1 year, 5 months ago
Description

Guests:

Helen Oakley, Director of Secure Software Supply Chains and Secure Development, SAP

On LinkedIn | https://www.linkedin.com/in/helen-oakley

On Twitter | https://x.com/e2hln

On Instagram | https://instagram.com/e2hln

Larry Pesce, Product Security Research and Analysis Director, Finite State [@FiniteStateInc]

On LinkedIn | https://www.linkedin.com/in/larrypesce/

On Twitter | https://x.com/haxorthematrix

On Mastodon | https://infosec.exchange/@haxorthematrix

____________________________

Hosts: 

Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]

On ITSPmagazine | https://www.itspmagazine.com/sean-martin

Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli

____________________________

Episode Notes

Sean Martin and Marco Ciappelli opened the discussion by examining the intricacies and potential pitfalls of the AI supply chain. Martin humorously asked Ciappelli when he had last checked the entire supply chain behind an AI session, prompting reflection on how casually people approach AI today.

The conversation then shifted as Oakley and Pesce were introduced, with Oakley explaining her role in leading cybersecurity for the software supply chain at SAP and co-founding the AI Integrity and Safe Use Foundation. Pesce shared his expertise in product security research and pen testing, emphasizing the importance of securing AI integrations.

Preventing the AI Apocalypse

One of the session's highlights was the talk titled "AI Apocalypse Prevention 101." Oakley and Pesce shared insights into the risks of AI overtaking human roles and discussed ways to prevent a hypothetical AI apocalypse. Oakley humorously noted her own experimentation with deepfakes and emphasized the importance of addressing root causes to avert catastrophic outcomes.

Pesce contributed by highlighting the need for a comprehensive Bill of Materials (BOM) for AI, pointing out how it differs from traditional software due to its unique reliance on multiple layers, including hardware and software components.

AI BOM: A Tool for Understanding and Compliance

The conversation then turned to the AI BOM's significance. Oakley explained that an AI BOM serves as an ingredient list, akin to the label on packaged goods: it details the datasets, models, and energy consumption behind a system, transparency that is critical for detecting model decay or malicious behavior over time.
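To make the "ingredient list" idea concrete, here is a minimal sketch of what an AI BOM record might capture. The field names and values are purely illustrative assumptions, not taken from the episode or from any specific standard (CycloneDX and SPDX each define their own machine-learning BOM schemas):

```python
from dataclasses import dataclass, field, asdict
from typing import List, Optional
import json

# Hypothetical, minimal AI BOM record. Field names are illustrative only;
# real AI BOM formats (e.g., CycloneDX ML-BOM, SPDX AI profile) differ.
@dataclass
class AIBOM:
    model_name: str
    model_version: str
    datasets: List[str] = field(default_factory=list)     # training/eval data sources
    base_models: List[str] = field(default_factory=list)  # upstream models this one derives from
    energy_kwh: Optional[float] = None                    # estimated training energy use

# Example entry for a hypothetical model.
bom = AIBOM(
    model_name="example-classifier",
    model_version="1.0.0",
    datasets=["internal-support-tickets-2023"],
    base_models=["bert-base-uncased"],
    energy_kwh=120.5,
)

# Serialize the record so it can travel alongside the model artifact.
print(json.dumps(asdict(bom), indent=2))
```

The point of the sketch is simply that, like a food label, each layer of an AI system (data, upstream models, resource use) is enumerated in one machine-readable place.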

Pesce noted the AI BOM's potential to guide pen testing and compliance. He emphasized the challenges companies face in keeping up with rapidly evolving AI technology, suggesting that an AI BOM could streamline compliance efforts.

Engagement at the CISO
