Episodes

  • Open source technologies are transforming how businesses manage real-time data on cloud platforms. By leveraging flexible, scalable, and cost-effective open-source tools, organisations can process and analyse large volumes of data with speed and precision. These technologies offer unmatched transparency, customisation, and community-driven innovation, making them ideal for real-time monitoring, analytics, and IoT applications.

    As data demands grow, open-source solutions ensure that businesses stay agile, reduce vendor lock-in, and maintain full control over their cloud infrastructure. The result? Faster insights, smarter decision-making, and enhanced performance—all powered by open source. 

    In this episode, Paulina Rios Maya, Head of Industry Relations at EM360Tech, speaks to Mikhail Epikhin, Chief Technology Officer at Double Cloud, about The Power of Open Source in Cloud Platforms. 

    Key Takeaways:

    Open-source technologies provide standard building blocks for products.
    Community-driven innovation is essential for the evolution of technology.
    Flexibility in data infrastructure is crucial for real-time processing.
    Observability and monitoring are vital for performance optimisation.
    Managed services can accelerate product development and feature implementation.

    Chapters:

    00:00 - The Power of Open Source in Cloud Platforms

    05:24 - Apache Airflow: Enhancing Real-Time Data Management

    10:08 - Balancing Open Source and Managed Services

    13:57 - Best Practices for Scalability and Performance

  • Big Data LDN 2024, the UK’s leading data, analytics, and AI event, is less than a week away – promising two days filled with ground-breaking stories, expert insights, and endless innovation.

    Taking place at the Kensington Olympia in London on September 18-19, this year’s event features fifteen theatres and over 300 expert speakers sharing insights on some of the industry’s hottest topics – from generative AI to data analytics and privacy. 

    With the event less than a week away, EM360Tech’s Head of Podcast Production, Paulina Rios Maya, grabbed Big Data LDN’s Event Director, Andy Steed, for a chat about his expectations for this year’s event and its growing importance in the data world.

    In the episode, they discuss: 

    The exciting themes or breakthroughs attendees can expect to see showcased this year
    How Big Data London remains relevant in such a rapidly evolving field
    The unique networking opportunities or interactive experiences attendees have at the conference
    The standout sessions or keynote speakers at the conference

    Chapters:

    00:00: Introduction to Big Data LDN 2024

    01:35: Showcasing Data Stories, Transformations, and Challenges

    02:33: The Networking Opportunities with Industry Leaders and Peers at Big Data LDN 2024

    05:01: Staying Relevant with a Focus on Generative AI and Real-World Use Cases

    06:55: The Importance of Data Events for Community Building and Learning

    About Big Data LDN 2024

    Big Data London is the UK's largest data and analytics event, attracting over 16,500 visitors each year. Taking place at the Olympia in London on September 18-19, this year’s event features fifteen theatres and over 300 expert speakers across the two-day conference. 

    Attendees can meet face-to-face with tech providers and consultants to find solutions to their data challenges and view the latest product releases and software demos to enhance their business's data capabilities.

    It’s also a great opportunity for attendees to strengthen their business network with new and existing partners, immerse themselves in the data community, and network with speakers, colleagues, and practitioners across the two days at Big Data LDN.

  • Sustainable sourcing is essential for businesses committed to environmental and social responsibility, but achieving it requires accurate and reliable data. Master Data Management (MDM) ensures that all sourcing data—such as supplier information, certifications, and compliance records—is consistent and up-to-date. This enables organisations to make informed decisions that align with their sustainability goals, reduce waste, and promote ethical practices throughout their supply chain.

    MDM is the foundation of a successful sustainability strategy. By providing a single source of truth for all critical data, MDM helps businesses monitor and track their sustainability efforts effectively. With accurate data, companies can identify opportunities to improve resource efficiency, reduce carbon footprints, and ensure compliance with environmental standards, ultimately leading to a more sustainable and resilient business model.
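
    To make the "single source of truth" idea concrete, here is a minimal sketch of an MDM-style merge. The record fields and the merge rule are illustrative assumptions, not Stibo Systems' implementation: two versions of a supplier record are consolidated by keeping the most recently updated value for each field.

    ```python
    from datetime import date

    # Two versions of the same supplier held in different source systems;
    # each field value is paired with the date it was last updated.
    erp_record = {
        "supplier_id": "SUP-001",
        "name": ("Acme Textiles Ltd", date(2024, 3, 1)),
        "fsc_certified": (False, date(2023, 11, 20)),
    }
    procurement_record = {
        "supplier_id": "SUP-001",
        "name": ("Acme Textiles Limited", date(2024, 6, 15)),
        "fsc_certified": (True, date(2024, 6, 15)),
    }

    def merge_golden_record(*records):
        """Build the golden record by keeping, for every field, the value
        with the most recent update date across all source systems."""
        golden = {"supplier_id": records[0]["supplier_id"]}
        fields = sorted({k for r in records for k in r if k != "supplier_id"})
        for field in fields:
            candidates = [r[field] for r in records if field in r]
            value, _latest = max(candidates, key=lambda pair: pair[1])
            golden[field] = value
        return golden

    # The newer name and certification status win the merge.
    print(merge_golden_record(erp_record, procurement_record))
    ```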

    In this episode, George Firican, Founder of LightsOnData, speaks to Matthew Cawsey, Director of Product Marketing and Solution Strategy, and Paarijat Bose, Customer Success Manager at Stibo Systems, to discuss sustainable sourcing and why accurate data matters. 

    Key Takeaways:

    Sustainable sourcing involves understanding the provenance and environmental impact of products, ensuring compliance with regulations, and meeting sustainability goals.
    Data completeness and accuracy are crucial in meeting regulatory requirements and avoiding issues like greenwashing.
    Managing sustainability data requires a solid foundation of MDM to ensure data accuracy, stewardship, and semantic consistency.
    MDM solutions help companies collect, manage, and share sustainability data, enabling them to meet compliance requirements and achieve their sustainability goals.

    Chapters:

    00:00 - Introduction and Overview

    01:07 - The Challenge of Collecting Data for Compliance and Reporting

    02:31 - Data Accuracy and Completeness in the Supply Chain

    05:23 - Regulations and the Demand for Transparent and Complete Data

    08:41 - The Role of Master Data Management in Sustainability

    15:51 - How Data Management Technology Solutions Help Achieve Sustainability Goals

    21:02 - The Need to Start Early and Engage with Data Management Solutions

    22:01 - Conclusion and Call to Action

  • Data provenance is essential for maintaining trust and integrity in data management. It involves tracking the origin of data and understanding how it has been processed and handled over time. By focusing on fundamental principles such as identity, timestamps, and the content of the data, organisations can ensure that their data remains accurate, consistent, and reliable.

    Implementing data provenance does not require significant changes or large investments. Existing technologies and techniques can be seamlessly integrated to provide greater transparency and control over data. With data provenance, businesses can confidently manage their data, enhancing decision-making and fostering stakeholder trust.
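
    As a rough illustration of those fundamentals (identity, timestamps, and the content of the data), a provenance trail can be kept as a chain of hashed entries, one per processing step. This is a minimal sketch, not the SCITT specification:

    ```python
    import hashlib
    import json
    from datetime import datetime, timezone

    def provenance_entry(actor, data, previous_hash=""):
        """Record who handled the data (identity), when (timestamp),
        and a digest of what it contained (content), chained to the
        previous entry so that tampering is detectable."""
        entry = {
            "identity": actor,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "content_hash": hashlib.sha256(data).hexdigest(),
            "previous": previous_hash,
        }
        entry_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        return entry, entry_hash

    trail = []
    entry1, hash1 = provenance_entry("ingest-service", b"raw supplier records")
    trail.append(entry1)
    entry2, hash2 = provenance_entry("cleaning-job", b"validated records", hash1)
    trail.append(entry2)
    print(json.dumps(trail, indent=2))
    ```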

    In this episode, Jon Geater, Co-Chair of the Supply Chain Integrity Transparency and Trust (SCITT) Working Group, speaks to Paulina Rios Maya, Head of Industry Relations, about data provenance. 

    Key Takeaways: 

    Data provenance is knowing where data comes from and how it has been handled, ensuring trust and integrity.
    The fundamental principles of data provenance include identity, timestamps, and the content of the data.
    Data provenance can be implemented by integrating existing technologies and techniques without significant changes or investments.
    Data provenance helps with compliance, such as GDPR, by providing a transparent record of data handling and demonstrating compliance with requests.

    Chapters: 

    00:00 - Introduction and Background

    02:01 - Understanding Data Provenance

    05:47 - Implementing Data Provenance

    10:01 - Data Provenance and Compliance

    13:50 - Success Stories and Industry Applications

    18:10 - Conclusion and Call to Action

  • FME is a vital tool in disaster management and response. It enables the integration and transformation of geospatial data for real-time tracking of disasters and hazards. By ensuring accurate and timely data analysis, it provides essential decision support for disaster management professionals.

    During the Maui wildfires, FME and the Pacific Disaster Centre were crucial in managing and analysing critical data, allowing for effective coordination and response. By facilitating seamless data sharing and collaboration among stakeholders, FME helps ensure that the correct information reaches the right people at the right time.

    In this episode of the EM360 Podcast, Alejandro Leal, an Analyst at KuppingerCole, speaks to Jorma Rodieck, a GIS Specialist at the Pacific Disaster Centre, about the importance of FME. 

    Key Takeaways:

    FME is an essential tool in disaster management and response, allowing for the integration and transformation of geospatial data.
    FME enables real-time data analysis and decision support for disaster management professionals.
    During the Maui wildfires, FME was instrumental in managing and analyzing critical data, providing a common operating picture for response efforts.
    FME ensures effective data sharing and collaboration among various stakeholders, enabling smooth interoperability between departments and agencies.

    Chapters:

    00:00 - Introduction and Background

    02:35 - The Role of FME in Disaster Management

    06:44 - Managing and Analyzing Critical Data with FME

    10:34 - FME's Impact during the Maui Wildfires

    11:59 - Ensuring Effective Data Sharing and Collaboration

    15:20 - The Future of FME in the Pacific Disaster Center

    18:15 - Conclusion

  • Open source real-time analytics offers unparalleled advantages, providing businesses with freedom and independence to maintain operations seamlessly, even if a vendor issue arises. However, the journey isn't without its challenges. Open source solutions can often be clunky and require specialised expertise to manage effectively. 

    This is where DoubleCloud comes in, offering a managed platform that addresses these obstacles by handling crucial responsibilities such as backups, high availability, and security updates, allowing businesses to focus on leveraging their data.

    In this podcast, Christina Stathopoulos speaks to Vladimir Borodin, Co-Founder and CEO of DoubleCloud, about open source strategies and the advantages of the DoubleCloud solution.  

    Key Takeaways:

    DoubleCloud's managed platform helps overcome the challenges of open source, such as clunkiness and a lack of expertise.
    Successful customer use cases demonstrate the performance and cost benefits of DoubleCloud's solution.
    The transition phase to DoubleCloud's solution depends on the complexity of the application.
    Using open source whenever possible is recommended.

    Chapters:

    00:00 - Introduction and Background

    02:29 - The Advantages of Open Source

    04:21 - Challenges of Open Source

    06:47 - The Power of Real-Time Analytics

    09:11 - Success Stories: Improved Performance and Reduced Costs

    12:54 - Navigating the Transition to DoubleCloud's Solution

    15:14 - The Importance of Using Open Source

  • Privacy by Default and Design is a fundamental principle of the General Data Protection Regulation (GDPR). It prioritises transparency, user control, and data security from the outset. This approach ensures that privacy is integrated into systems and processes by default rather than as an afterthought. 

    By embedding these practices, organisations enhance trust and accountability while meeting regulatory requirements. However, challenges such as resistance to change and the need for cultural transformation must be addressed to implement this principle effectively.

    In this episode of the Don’t Panic It’s Just Data podcast, Tudor Galos, Senior Privacy Consultant, speaks to Paulina Rios Maya, Head of Industry Relations, about how the impact of privacy by default and design extends to user experience, where issues like consent fatigue and the need for user-friendly interfaces arise. 

    Key Takeaways:

    Organisations face challenges in implementing privacy by default and design, including resistance to change and the need for cultural transformation.
    Privacy by default and design impact user experience, with issues like consent fatigue and the need for user-friendly interfaces.
    Regulations like GDPR and CCPA incorporate privacy by default and design principles, emphasising compliance and accountability.

    Chapters:

    00:00 - Introduction and Overview

    01:00 - Core Principles of Privacy by Default and Design

    02:19 - Difference from Traditional Privacy Practices

    04:09 - Challenges in Implementing Privacy by Default and Design

    05:33 - Impact of Privacy by Default on User Experience

    08:14 - Alignment of Privacy by Default with Regulations

    09:04 - Ensuring Compliance and Trust

    11:24 - Implications of Emerging Technologies on Privacy

    13:15 - Innovations in Privacy-Enhancing Technologies

    15:50 - Conclusion

  • Safe Software's Feature Manipulation Engine (FME) plays a pivotal role in the City of Fremont's operations, particularly in ensuring accurate and efficient data submissions under the Racial and Identity Profiling Act (RIPA). By automating complex workflows and enhancing data quality, FME not only ensures seamless compliance with RIPA requirements but also optimises processes for their ITS and GIS divisions.

    FME also drives innovation in projects like the DroneSense programme and their Cityworks asset management integration. With seamless data integration and powerful visualisations, FME empowers the City of Fremont to enhance operations, improve asset management, and support informed decision-making.

    In this episode, Jonathan Reichental, founder at Human Future, speaks to John Leon, GIS Manager for the City of Fremont, to discuss: 

    FME
    RIPA
    Public Safety

    Chapters:

    00:00 - Introduction and Overview of the City of Fremont and IT/GIS Division

    03:01 - Explanation of the Racial and Identity Profiling Act (RIPA)

    04:27 - Challenges in Meeting RIPA Standards and Utilizing FME

    06:21 - How FME Ensures Error-Free RIPA Data Submissions

    09:40 - Benefits of Using FME for RIPA Compliance

    10:39 - Other Innovative Projects Utilizing FME in the City of Fremont

    13:30 - Future Plans for FME in the City of Fremont

    17:17 - Recommendations for Government Agencies: Leverage FME for Data Submissions

  • Real-time data insights help identify performance bottlenecks, manage data efficiently, and drive innovation. Despite the growing need for these capabilities, organisations often face challenges in implementing effective real-time analytics. 

    Achieving high-concurrency data processing is crucial for overcoming performance bottlenecks in real-time analytics. Embracing real-time analytics is not just a necessity, but a way to transform your data into actionable insights, optimise performance, and fuel business growth.

    Yellowbrick is a modern data platform built on Kubernetes for enterprise data warehousing, ad-hoc and streaming analytics, and AI and BI workloads. It delivers comprehensive data security, unparalleled flexibility, and high performance. 

    In this podcast, Doug Laney, a Data Strategy Innovation Fellow with West Monroe, speaks to Mark Cusack, the CTO of Yellowbrick, about the power of real-time analytics. 

    Key Takeaways:

    Real-time analytics enables faster business decisions based on up-to-date data and focuses on enabling actions.
    Using a SQL data platform like Yellowbrick, designed for high-concurrency data processing, can address performance bottlenecks in real-time analytics.

    Chapters:

    00:00 - Introduction and Overview

    01:07 - The Benefits of Real-Time Analytics

    06:23 - Overcoming Challenges in Implementing Real-Time Analytics

    06:51 - High Concurrency Data Processing for Real-Time Analytics

    13:59 - Yellowbrick: A Secure and Efficient SQL Data Platform

    Accurate and reliable data is essential for training effective AI models. High-quality data ensures precision, reduces bias, and builds trust in AI systems. Master Data Management (MDM) systems enhance data quality by integrating data from multiple sources, enforcing data governance, and providing a single source of truth. This helps eliminate discrepancies and maintain data integrity.

    Integrating Product Information Management (PIM) with MDM ensures accurate and consistent product data across all channels, crucial for data-driven marketing. This combination centralises customer and product data, enabling precise targeting and personalised experiences. MDM and PIM integration leads to higher ROI and improved customer satisfaction by supporting effective marketing strategies.

    In this episode of the EM360 Podcast, Paulina Rios Maya speaks to Philipp Krueger about integrating PIM and MDM functionalities and how doing so streamlines operations, improves data accuracy, and supports data-driven marketing strategies. 

    Chapters

    00:00 - Introduction and Importance of Data Quality in AI Models

    05:27 - Core Capabilities of an MDM System

    08:13 - The Role of Data Governance in Data Management

    13:37 - Enhancing Customer Experience and Driving Sales with Pimcore

    19:47 - Integration of PIM and MDM Functionalities for Data-Driven Marketing Strategies

    22:59 - The Impact of Accurate Data on Revenue Growth

    27:28 - Simplifying Data Management with a Single Platform

  • One of the biggest challenges businesses face when it comes to data visualisation is handling the volume of data and the need for faster processing methods. 

    There's a common misconception that effective data visualisation must be fancy and interactive, but simple visuals can be just as powerful. Ann K. Emery, an expert in the field, believes that accessibility doesn't have to be time-consuming or expensive. 

    In this podcast, she shares actionable strategies for creating accessible visualizations with Paulina Rios Maya, Head of Industry Relations at EM360Tech. 

    Key Takeaways

    Avoiding red-green colour combinations
    Ensuring proper colour contrast
    Using direct labelling instead of legends
    Avoiding all-caps text
    Using grey so that important information stands out
    Employing small multiples to simplify complex visualisations
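
    A few of these tips (direct labels instead of a legend, grey for context with a single colour-blind-safe emphasis colour, and a sentence-case title) appear in this minimal matplotlib sketch. It is an illustration of the ideas with invented data, not Emery's own code:

    ```python
    import matplotlib.pyplot as plt

    years = [2020, 2021, 2022, 2023]
    series = {
        "Team A": [10, 14, 19, 25],  # the series we want readers to focus on
        "Team B": [12, 13, 12, 14],
        "Team C": [9, 10, 11, 11],
    }

    fig, ax = plt.subplots()
    for name, values in series.items():
        # Grey for context; one colour-blind-safe blue (no red-green pair)
        # marks the line that matters.
        colour = "#1f77b4" if name == "Team A" else "#b0b0b0"
        ax.plot(years, values, color=colour, linewidth=2)
        # Direct labelling at the end of each line replaces a legend.
        ax.annotate(name, xy=(years[-1], values[-1]),
                    xytext=(5, 0), textcoords="offset points",
                    color=colour, va="center")

    ax.set_title("Output per team, 2020 to 2023")  # sentence case, no all-caps
    for side in ("top", "right"):
        ax.spines[side].set_visible(False)
    plt.show()
    ```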

    Chapters: 

    00:00 - Introduction

    00:54 - Defining Accessibility in Data Visualization

    02:17 - Big A Accessibility Tips

    06:36 - Little a Accessibility Strategies

    12:28 - The Future of Data Accessibility

  • Managing cloud costs effectively has become a significant challenge for organisations relying on public cloud services. FinOps addresses these challenges by ensuring efficient spending and governance of cloud resources. Key practices in FinOps include achieving complete visibility into cloud usage and costs, fostering cross-functional collaboration between finance, operations, and engineering teams, and utilising data-driven decision-making to optimise cloud investments. 

    By embracing a centralised team, organisations can instil a culture of governance and efficiency in cloud cost management. This approach can lead to enhanced resource utilisation and substantial cost savings. With Vantage, your organisation can cultivate a robust cloud cost governance and efficiency culture, ensuring your cloud investments yield maximum value.
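
    To make that visibility concrete, here is a minimal sketch of tag-based cost allocation, the kind of roll-up FinOps reporting depends on. The billing records and field names are invented for illustration and are not Vantage's data format or API:

    ```python
    from collections import defaultdict

    # Hypothetical daily cost records, as they might arrive from a
    # cloud provider's billing export.
    cost_records = [
        {"service": "compute", "team": "platform", "usd": 412.50},
        {"service": "storage", "team": "platform", "usd": 88.10},
        {"service": "compute", "team": "analytics", "usd": 934.00},
        {"service": "warehouse", "team": "analytics", "usd": 1210.75},
        {"service": "compute", "team": None, "usd": 301.20},  # untagged spend
    ]

    def allocate_costs(records):
        """Roll spend up by owning team; untagged spend is surfaced
        separately so it can be chased down rather than hidden."""
        totals = defaultdict(float)
        for record in records:
            totals[record["team"] or "UNTAGGED"] += record["usd"]
        return dict(totals)

    for team, usd in sorted(allocate_costs(cost_records).items()):
        print(f"{team:<10} ${usd:,.2f}")
    ```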

    In this episode of the EM360 Podcast, Kevin Petrie, VP of research at BARC US, speaks to Ben Schaechter, CEO and co-founder of Vantage, to discuss: 

    FinOps
    Vantage’s platform
    Cloud costs and FinOps practices

    Chapters

    00:00 - Introduction and Overview

    02:02 - Understanding FinOps and Cloud Cost Governance

    07:45 - Best Practices in FinOps: Centralization and Collaboration

    13:50 - The Role of Data-Driven Insights in Optimizing Cloud Costs

  • Managing large volumes of data in the context of AI and machine learning applications presents challenges related to data quality, data preparation, and automation.

    The requirements of data management are changing with the advent of generative AI, requiring more flexibility and the ability to handle larger volumes of data. Pimcore leverages AI and machine learning to automate data utilization and improve data intelligence. 

    By streamlining data management and integrating various data sources, Pimcore drives revenue growth for its customers. The platform combines data management and experience management to deliver personalized data across communication channels. Pimcore’s MDM solution addresses the challenges of integrating data for both human and machine consumption. The choice between physical and virtual MDM hubs depends on the use case and industry. 

    In this episode of the EM360 Podcast, Doug Laney, Data and Analytics Strategy Innovation Fellow at West Monroe, speaks to Dietmar Rietsch, Managing Director and Co-Founder of Pimcore, to discuss: 

    Data management
    AI
    Machine learning
    Data quality
  • Maximising data relationships through text analytics, particularly with tools like LLMs and Knowledge Graphs, offers organisations unprecedented insights and capabilities. By leveraging these advanced technologies, businesses can unlock hidden connections and patterns within their data, leading to more informed decision-making and strategic planning. 

    Integrating Ontotext's solutions is a game-changer, empowering organisations to extract, organise, and visualise complex information from unstructured data sources. With Ontotext's expertise in semantic technology, businesses can construct robust knowledge graphs that offer a comprehensive understanding of their data landscape. This comprehensive approach not only facilitates better analysis and interpretation of data but also ignites innovation and propels business growth in today's increasingly data-driven world.

    In this episode of the EM360 Podcast, Paulina Rios Maya, Head of Industry Relations, speaks to Doug Kimball, Chief Marketing Officer at Ontotext, to discuss: 

    AI in Enterprise Knowledge
    LLMs
    Knowledge Graphs

    Chapters

    00:00 - Challenges of Integrating LLMs into Enterprise Knowledge Management Systems

    04:35 - Enhancing Compatibility and Efficacy with Knowledge Graphs

    07:21 - Innovative Strategies for Integrating LLMs into Knowledge Management Frameworks

    11:07 - The Future of LLM-Driven Knowledge Management Systems: Intelligent Question Answering and Insight Enablement

  • Managing cloud computing costs is a pressing challenge faced by organisations of all sizes across industries. As businesses increasingly migrate their operations to the cloud, the complexity of managing and optimizing costs grows exponentially. Without proper oversight and strategy, cloud expenses can quickly spiral out of control, leading to budget overruns and financial inefficiencies. 

    Vantage addresses this issue head-on by providing organizations with a powerful platform equipped with automated cost recommendations, customizable reports, and real-time monitoring capabilities. By leveraging advanced analytics and machine learning, Vantage empowers teams to gain unparalleled visibility into their cloud spending and make informed decisions to optimize costs. 

    In this episode of the EM360 Podcast, Dana Gardner, Principal Analyst at Interarbor Solutions, speaks to Ben Schaechter, CEO and Co-founder of Vantage, to discuss: 

    Cloud cost management
    FinOps
    Cost optimization
    Automated cost recommendations
  • Ensuring the reliability and effectiveness of AI systems remains a significant challenge. In most use cases, generative AI must be combined with access to your company data, a process called retrieval-augmented generation (RAG). The results from generative AI are vastly improved when the model is enhanced with contextual data from your organization. 

    Most practitioners rely on vector embeddings to surface content based on semantic similarity. While this can be a great step forward, achieving good quality requires a combination of multiple vectors with text and structured data, using machine learning to make final decisions. 

    Vespa.ai, a leading player in the field, enables solutions that do this while keeping latencies suitable for end users, at any scale. 
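
    As a rough sketch of that hybrid approach (plain Python with invented documents and weights, not Vespa.ai's actual API), a retrieval layer can blend a vector-similarity score with a lexical score before ranking. In a real system the weights, or a full machine-learning model, would be learned from relevance judgements:

    ```python
    import math

    def cosine(a, b):
        """Semantic signal: cosine similarity between embeddings."""
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norm if norm else 0.0

    def keyword_score(query_terms, doc_text):
        """Crude lexical signal: fraction of query terms in the document."""
        words = set(doc_text.lower().split())
        return sum(term in words for term in query_terms) / len(query_terms)

    def hybrid_rank(query_vec, query_terms, docs, w_vec=0.7, w_kw=0.3):
        """Blend semantic and lexical evidence into one ranking score."""
        scored = []
        for doc in docs:
            score = (w_vec * cosine(query_vec, doc["embedding"])
                     + w_kw * keyword_score(query_terms, doc["text"]))
            scored.append((score, doc["text"]))
        return sorted(scored, reverse=True)

    docs = [
        {"text": "quarterly revenue report for 2023", "embedding": [0.9, 0.1, 0.2]},
        {"text": "employee onboarding handbook", "embedding": [0.1, 0.8, 0.3]},
    ]
    print(hybrid_rank([0.85, 0.15, 0.25], ["revenue", "2023"], docs))
    ```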

    In this episode of the EM360 Podcast, Kevin Petrie, VP of research at BARC US, speaks to Jon Bratseth, CEO of Vespa.ai, to discuss: 

    The opportunity for generative AI in business
    Why you need more than vectors to achieve high quality in real systems
    How to create high-quality generative AI solutions at an enterprise scale
  • Geographic Information Systems (GIS) have transformed urban landscape analysis and government policy creation, albeit not without challenges. In the past, GIS analysts often visited locations to piece together information physically.

    With the help of cutting-edge platforms like Safe Software’s FME, cities like Burnaby, British Columbia, have revolutionised their operations. This has led to a significant enhancement in the quality of life for its residents. From predictive modelling to real-time data analysis, the potential for innovation appears boundless, underscoring the importance of GIS technology in improving urban operations.

    In this episode of the EM360 Podcast, Wayne Eckerson speaks to Herman Louie, GIS Analyst at the City of Burnaby, to discuss: 

    Design and implementation of GIS solutions
    Safe Software’s FME platform
    Transition to NG9-1-1
    The future of GIS
  • Government organisations face a multitude of challenges when it comes to managing their data effectively. From interoperability issues between systems to the need for seamless collaboration across agencies, the complexity can be overwhelming. Safe Software's FME platform offers a comprehensive solution to these challenges by providing a flexible and intuitive data integration platform tailored to the unique needs of government agencies.

    With FME, government organisations can overcome the barriers that hinder efficient data management. FME enables streamlined operations and improved decision-making processes by seamlessly connecting disparate systems and applications. Whether it's digital plan submissions, emergency services coordination, or interagency health data sharing, FME empowers government agencies to achieve their data integration goals with ease.

    In this episode of the EM360 Podcast, Doug Laney, Data and Analytics Strategy Innovation Fellow at West Monroe, speaks to Tom Seymour, Government Sales Team Lead at Safe Software, to discuss:  

    Data integration and interoperability
    Safe Software’s FME platform
    FME in governments
    Advantages of FME
    ROI with FME
  • Ever wonder how search engines understand the difference between "apple," the fruit, and the tech company? It's all thanks to knowledge graphs! These robust and scalable databases map real-world entities and link them together based on their relationships. 

    Imagine a giant web of information where everything is connected and easy to find. Knowledge graphs are revolutionizing how computers understand and process information, making it richer and more relevant to our needs. 
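
    As a toy illustration (generic Python, not Ontotext GraphDB or a real triple store), a knowledge graph can be modelled as subject-predicate-object triples, and the two senses of "apple" stay distinct because each node carries its own relationships:

    ```python
    # A tiny knowledge graph as subject-predicate-object triples.
    triples = [
        ("Apple_Inc", "is_a", "Company"),
        ("Apple_Inc", "founded_by", "Steve_Jobs"),
        ("Apple_Inc", "headquartered_in", "Cupertino"),
        ("apple_fruit", "is_a", "Fruit"),
        ("apple_fruit", "grows_on", "Apple_Tree"),
    ]

    def describe(entity):
        """Collect everything the graph knows about an entity."""
        return [(predicate, obj) for subject, predicate, obj in triples
                if subject == entity]

    # The two 'apple' nodes are disambiguated by their relationships:
    for node in ("Apple_Inc", "apple_fruit"):
        print(node, "->", describe(node))
    ```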

    Ontotext is a leading provider of knowledge graph technology, offering a powerful platform to build, manage, and utilise knowledge graphs for your specific needs. Whether you're looking to enhance search capabilities, improve data analysis, or unlock new insights, Ontotext can help you leverage the power of connected information.

    In this episode of the EM360 Podcast, George Firican, Founder of LightsOnData, speaks to Sumit Pal, Strategic Technology Director at Ontotext, to discuss: 

    Knowledge Graphs
    Use Cases
    Ontotext GraphDB
    Integration of AI
    Industry best practices
  • The traditional data warehousing landscape is changing. The concept of a private data cloud offers a compelling alternative to both cloud PaaS and traditional data warehousing. Imagine a secure, dedicated environment for your data, existing entirely within your organisation's control.

    Yellowbrick, a leader in private data cloud solutions, empowers businesses to leverage their data on their terms. Their Bring Your Own Cloud (BYOC) approach offers unmatched flexibility and control. You can deploy Yellowbrick anywhere your data needs to be—public cloud, private cloud, or even the network edge. This ensures compliance with regulations, keeps your data exactly where you want it, and can bring down costs. 

    In this episode of the EM360 Podcast, Wayne Eckerson, President of Eckerson Group, speaks to Mark Cusack, Chief Technology Officer of Yellowbrick, to discuss:

    The need for hybrid multi-cloud data platforms
    How Yellowbrick differentiates
    The future of private data cloud
    Why Yellowbrick?