Long-term Archiving
By: Storage Magazine
Download this next:
Future Standards will be Functional
By:
Type: Replay
Standards today are published in document formats that do not lend themselves to machine readability or automated compliance. In this session, we present a rethinking of standards: the process and data requirements in standards can be expressed in a multi-modal model for machine execution.
These are also closely related to: "Long-term Archiving"
-
Shaping Data from Multiple Sources: Filtering, Redaction, and Enrichment
By:
Type: Replay
No matter the industry, data quality is a struggle for anyone managing observability at scale. Even when teams do have the data they need, the format and content of those logs can be a pain (inefficient formats and missing timestamps, anyone?). Join Cribl for an interactive demo, where Ed and Desi will: • Reveal common data quality issues, including inefficient formats and missing timestamps • Show how to build an observability pipeline that normalizes data from multiple sources as it comes in, increasing flexibility around what you can analyze • Demonstrate how to enrich logs in-flight with GeoIP information and more for additional context and control • Discuss three more strategies to standardize and clean up your events, opening up more options for your data
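The normalization and enrichment steps described above can be sketched in plain Python. This is a minimal illustration, not Cribl's API: the field names (`ts`, `src_ip`, `geo`) and the stub lookup table are assumptions standing in for a real pipeline and GeoIP database.

```python
from datetime import datetime, timezone

# Hypothetical lookup table standing in for a real GeoIP database.
GEOIP_STUB = {"203.0.113.7": {"country": "US", "city": "Ashburn"}}

def shape_event(raw: dict) -> dict:
    """Normalize one log event: repair timestamps and enrich with geo context."""
    event = dict(raw)
    ts = event.get("ts")
    if ts is None:
        # Missing timestamps are a common data-quality issue; fall back to ingest time.
        event["ts"] = datetime.now(timezone.utc).isoformat()
    elif isinstance(ts, (int, float)):
        # Convert epoch seconds to ISO 8601 so downstream tools agree on format.
        event["ts"] = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
    # In-flight enrichment: attach geo context when the source IP is known.
    geo = GEOIP_STUB.get(event.get("src_ip", ""))
    if geo:
        event["geo"] = geo
    return event

shaped = shape_event({"ts": 1700000000, "src_ip": "203.0.113.7", "msg": "login"})
```

Applying the same function to every inbound event, regardless of source, is what gives a pipeline its normalizing effect.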
-
Vendor Agnostic Instrumentation With The OpenTelemetry Collector
By:
Type: Video
Achieving the best possible observability involves instrumenting your code with events and traces. This can be costly, and it's common for a large system to have components instrumented with different formats specific to different standards or vendors. In this session, I'll walk through using the OpenTelemetry Collector to combine traces and metrics from services instrumented with different formats. Use cases covered: • Interoperability between OpenTracing, OpenCensus, and other formats • Using processors to modify spans • Migrating from one tracing solution to another without rewriting code • Sending traces to two separate backends
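The core of format interoperability is renaming vendor- or standard-specific span fields into one common schema. The sketch below is a toy illustration of that idea; the field names in the mapping are assumptions, and in practice this translation is performed by the Collector's receivers and processors, not by application code.

```python
# Hypothetical mapping from OpenTracing-style field names to
# OpenTelemetry-style ones; unknown fields pass through unchanged.
OPENTRACING_TO_OTEL = {
    "operation_name": "name",
    "start_time": "start_time_unix_nano",
    "tags": "attributes",
}

def translate_span(span: dict) -> dict:
    """Rename format-specific span fields into one common schema."""
    return {OPENTRACING_TO_OTEL.get(k, k): v for k, v in span.items()}

otel_span = translate_span(
    {"operation_name": "GET /users", "tags": {"http.status_code": 200}}
)
```

Because the translation is lossless renaming, the same span can then be exported to two separate backends without re-instrumenting the service.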
Find more content like what you just read:
-
Friday Flows Episode 6: Normalize Alerts with ChatGPT
By:
Type: Video
The strides in GenAI have been remarkable this year, but we're all still trying to figure out how it can impact our day-to-day work. In this demo, we use AI in the best way we know how to at Tines: by speeding up a security analyst's work and making their life a little easier! Use ChatGPT to normalize alert formats, in this case from CRWD. Alerts from multiple sources are converted into a standard format for easier processing by a SOC, and a ticket is then created.
-
Managing your media: Which codec for archive and library?
By:
Type: Talk
Join IBC365 on the 14th April at 2pm as they look at a vital aspect – and potential pain point – of the content supply chain: choosing the best codec for mastering and storing your media. If an efficient content supply chain is to be the key to managing content across multiple platforms and formats, particularly in the cloud, making the right choice of content format and video encoding is vital. During the discussion, we will explore the available options for making sure your library and archive content is kept in the most suitable format, including: - The role of mezzanine codecs - When to consider an open standard - The pros and cons of proprietary formats - How to future-proof your archive - Content management and playout considerations - Balancing video quality with storage, transcoding and bandwidth costs
-
SBOMs and SPDX: Now and in the Future
By:
Type: Replay
If software is an important part of your business and you need to comply with license terms and protect against security vulnerabilities, you need to know and track what is inside your software. Lists of software components and dependencies are typically referred to as Software Bills of Materials (SBOMs). Standardizing the format for SBOMs can improve the accuracy and efficiency of managing software license compliance and security vulnerabilities – especially if your software is the result of a long list of suppliers (e.g., a commercial product which depends on a commercial library which uses an open-source library which includes source from a different open-source project). With the May 12, 2021 U.S. Presidential Executive Order on Improving the Nation's Cybersecurity, several software suppliers are being required to produce their SBOMs in a standard format. SPDX is a standard format for SBOMs. Although it has been around for more than 10 years, it has gone through significant evolution, such as representing data on security vulnerabilities. There is a forthcoming major release which supports several new use cases, such as tracking the build process and tracking data about artificial intelligence models. In this talk, we will start with how you can use the widely adopted SPDX 2.3 spec to represent security vulnerability and license compliance data and then go into some of the new features of the SPDX 3.0 specification. We will touch on what goes into making a quality SBOM. At the conclusion of the talk, you will have a better understanding of how you can make use of SPDX, whether you are producing software or evaluating software you use (or plan to use).
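To make the standard-format idea concrete, here is a heavily trimmed sketch of what an SPDX 2.3-style document looks like in JSON. It is illustrative only: a valid SPDX document requires additional mandatory fields (e.g. `documentNamespace`, `creationInfo`, relationships), and the package names and versions here are invented.

```python
import json

# Trimmed, illustrative SPDX 2.3-style structure; not a complete valid SBOM.
sbom = {
    "spdxVersion": "SPDX-2.3",
    "dataLicense": "CC0-1.0",
    "SPDXID": "SPDXRef-DOCUMENT",
    "name": "example-app-sbom",
    "packages": [
        {
            "name": "example-lib",  # hypothetical dependency
            "SPDXID": "SPDXRef-Package-example-lib",
            "versionInfo": "1.4.2",
            "licenseConcluded": "Apache-2.0",
        }
    ],
}

serialized = json.dumps(sbom, indent=2)
```

Because the format is standardized, any SPDX-aware tool in a multi-supplier chain can parse this same structure to check license and vulnerability data.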
-
AREA Research Committee Webinar: Introduction to Khronos Group Camera API
By:
Type: Talk
Overview The AREA Interoperability and Standards program produces thought leadership content to increase awareness and provide concrete ways that all members of the AR ecosystem can use standards to integrate AR solutions in scalable ways. The Khronos Group is an open industry consortium of close to 200 leading hardware and software companies creating advanced, royalty-free acceleration standards for 3D graphics, Augmented and Virtual Reality, and parallel computation. This session will provide the latest updates on Khronos interoperability standards relating to AR, VR and the metaverse, including OpenXR, glTF and the new Camera Working Group. After the presentation there will be a Q&A session in which the audience can ask questions about any aspect of Khronos standardization activities. Topics to be covered include: - An overview of Khronos activities for AR - Key features and adoption of OpenXR - glTF asset format ecosystem and roadmap - Evolving glTF into a portable metaverse format - The goals and direction of the new Khronos Camera working group - How to get involved!
-
What the heck is an SBOM
By:
Type: Video
In this episode, Matt uses the analogy of America’s beloved boxed mac n’ cheese to define what a software bill of materials (SBOM) is and should be. He then points out that when making SBOMs, organizations should look to approved and standardized SBOM formats for them to be as clear and transparent as possible.
-
Tealium + Meta: How Conversions API & EMQ Can Improve Your Digital Operations
By:
Type: Talk
Tealium and Meta are partnering to discuss how advertisers can prepare for a cookie-less future in a privacy-compliant way. This webinar offers a refresher on how to use Tealium’s Conversions API integration with Meta and deep dives on how to improve that connection using your EMQ score. We will also discuss how to share data responsibly and review Meta’s terms for data sharing using Conversions API.
-
Interferometric Light Microscopy for Rapid Virus Titering and Characterization of Lipid Nanoparticle Preparations
By:
Type: Talk
Significant advancements in upstream biomanufacturing methods for cell and gene therapies have been achieved in recent years. However, final yields continue to be impacted by losses in downstream steps. Interferometric Light Microscopy (ILM) represents a fast and cost-effective method for process development teams to obtain size and concentration data within seconds. This presentation will cover the technical basis of ILM as it compares to other methods for monitoring nanoparticles. Experimental studies developing ILM utilization in characterizing virus preparations, detecting virus breakage and aggregation, and monitoring lipid nanoparticle preparations will be discussed.
-
New Privacy Technologies for Unicode and International Data Standards
By:
Type: Talk
Protecting the increasing use of international Unicode characters is required by a growing number of privacy laws in many countries, and by general privacy concerns around private data. Current approaches to protecting international Unicode characters increase the size of the data and change its format. This breaks many applications and slows down business operations; the current approach also randomly returns data in new and unexpected languages. A new approach with significantly higher performance and a small memory footprint can be customized to fit on small IoT devices. Unicode is an information technology standard for the consistent encoding, representation, and handling of text expressed in most of the world's writing systems. Unicode can be implemented by different character encodings; the Unicode standard defines the Unicode Transformation Formats (UTF) UTF-8, UTF-16, and UTF-32, as well as several other encodings. UTF-8 is the dominant encoding on the World Wide Web (used on over 95% of websites as of 2020, and up to 100% for some languages) and on most Unix-like operating systems. We will discuss new approaches to achieve portability, security, performance, a small memory footprint, and language preservation in the privacy protection of Unicode data. These new approaches provide granular protection for all Unicode languages and customizable alphabets, with byte-length-preserving protection of privacy-protected characters.
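The size-change problem comes from UTF-8 being a variable-width encoding, which the snippet below demonstrates, along with a simple check for the byte-length-preserving property the talk advocates. The `is_length_preserving` helper is an invented name for illustration, not part of any described product.

```python
# UTF-8 encodes different characters in 1 to 4 bytes, so protection schemes
# that substitute characters naively can change data size and break
# fixed-width fields downstream.
samples = {"A": 1, "é": 2, "中": 3, "𐍈": 4}  # char -> expected UTF-8 byte width

widths = {ch: len(ch.encode("utf-8")) for ch in samples}

def is_length_preserving(original: str, protected: str) -> bool:
    """Check the property the talk advocates: protection keeps byte length."""
    return len(original.encode("utf-8")) == len(protected.encode("utf-8"))
```

Replacing a 3-byte CJK character with another 3-byte character preserves record layout; replacing it with a 1-byte ASCII character does not.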
-
NetOps 101: Part 1 - Introduction to NetOps
By:
Type: Talk
Join us in our NetOps 101 multi-part series as we go back to the beginning to understand the formation of NetOps, why it was needed, and the models, frameworks and standards it was built upon that are still inherent in how we monitor network architectures today.
-
Creating Great-Looking PDFs From XML DITA
By:
Type: Talk
Join us for this presentation from Joanne Hannagen & Corinna Kinchin of Datazone, the developers of MiramoPDF, the intuitive, GUI-driven PDF-formatter for structured content (including DITA). You'll learn how information developers around the globe use MiramoPDF to overcome formatting challenges that often accompany the output of PDF files from structured XML content (like DITA). You'll see a quick demonstration of the tool in action and be able to ask questions of the MiramoPDF expert. About MiramoPDF: Easily integrated with any content management system or content creation and editing environment, MiramoPDF delivers all the formatting power and sophistication in one-tenth the time of programming-based formatters. MiramoPDF is standards-based and used globally in the financial, manufacturing, government, and healthcare sectors.
-
Amazon S3 API - Advanced Features
By:
Type: Video
The Simple Storage Service (S3) from Amazon Web Services (AWS) has established itself as the de facto standard interface for object storage across the storage industry. The Amazon S3 API contains many advanced features that enterprises expect in every storage platform. Join us in this webinar to look at how these features are implemented and how object storage solution providers can extend the S3 API to add even more value to on-premises deployments. In this webinar you will learn about: Versioning - the ability to retain a historical record of object updates; Tiering and ILM - moving object data through a data lifecycle; Replication - protecting data across multiple data centres/regions; Logging and Billing - tracking and charging for resource usage.
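The versioning feature above can be illustrated with a toy model: every PUT on a key appends a new version rather than overwriting, and a GET returns either the latest version or a specific one. This is a conceptual sketch of the semantics, not S3 client code; the class and version-ID scheme are invented for illustration.

```python
class VersionedBucket:
    """Toy model of S3-style object versioning: every PUT retains history."""

    def __init__(self):
        self._versions = {}  # key -> list of (version_id, body)

    def put(self, key: str, body: bytes) -> str:
        history = self._versions.setdefault(key, [])
        version_id = f"v{len(history) + 1}"  # real S3 uses opaque version IDs
        history.append((version_id, body))
        return version_id

    def get(self, key: str, version_id=None) -> bytes:
        history = self._versions[key]
        if version_id is None:
            return history[-1][1]  # plain GET returns the latest version
        return dict(history)[version_id]

bucket = VersionedBucket()
bucket.put("report.csv", b"first draft")
bucket.put("report.csv", b"final copy")
```

The retained history is what makes versioning useful for recovering from accidental overwrites or deletions.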
-
Handling UHD and HDR
By:
Type: Talk
Increasingly, production in UHD and HDR is becoming the ‘gold standard’ across vast areas of M&E. Whilst UHD can undoubtedly yield impressive results for viewers with sufficiently large and high-end screens, it is HDR that is often delivering the biggest ‘wow factor’ for content ranging from natural history factual to lavish costume dramas. And with many streaming services now stipulating UHD and HDR deliverables, the onus is on content creators to enable work in these formats as fluidly as possible. Updating workflows to deal with these new requirements can be a complex and costly exercise, and in an era of greatly increased competition it’s even more important to make the right choices. From on-set capture and post through to distribution and storage, it’s vital that content creators have the kit they need to establish effective and efficient workflows. In this IBC 365 webinar, a panel of industry experts will discuss the key issues informing the development of UHD and HDR workflows, including: - The extent to which UHD and HDR require a complete renovation of existing post workflows - Whether it’s possible to undertake a meaningful phased migration to UHD and HDR - Requirements of the primary equipment in post for handling these formats - Expectations of how the cloud will shape UHD and HDR production in future, including for post and subsequent storage of these memory-intensive formats - Whether UHD (4K)/HDR is likely to be the gold standard for many years to come – or will there also need to be mass support for 8K in the near-future?
-
Migrate VMs to KVM: A how-to guide
By: StorPool
Type: eBook
Read this detailed how-to guide to understand the benefits and process of migrating VMs from common hypervisors to cloud-first stacks powered by KVM. It includes a detailed step-by-step process and a command list.
-
Beyond Lakehouse Table Formats: The original creators of Delta Lake and Apache Iceberg™ take on interoperability
By:
Type: Video
Choosing the best unified platform for data, analytics and AI is easy — it’s lakehouse. Choosing the right open table format for your lakehouse? Not as easy. For most organizations, it’s a daunting decision that delays lakehouse adoption. The holdup hurts your ability to capitalize on analytics and AI. What if you didn’t have to choose a format? Join us for a conversation about interoperability with Michael Armbrust, original creator of Delta Lake, and Ryan Blue, an original creator of Apache Iceberg. They’ll discuss the state of open table formats and how Databricks is solving interoperability.
-
Demo: Getting Started With M-22-09
By:
Type: Replay
M-22-09 sets forth a Federal zero trust architecture strategy requiring agencies to meet specific cybersecurity standards and objectives by the end of FY2024, including encryption of all DNS requests and HTTP traffic in their environments and leveling up practices on sensitive data monitoring, data categorization, and information sharing. In this demo, we'll show you how to: - Easily route data to multiple destinations via a data pipelining engine between data sources and destinations - Transform data into any format or protocol for secure sharing - Encrypt DNS requests and HTTP traffic, redact or mask sensitive data, and enrich logs with relevant categorization information
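The redaction-and-categorization step can be sketched as a small transform applied to each event in flight. This is a hand-rolled illustration under assumed field names (`message`, `data_category`); a real pipeline engine would supply its own masking and enrichment functions.

```python
import re

# Hypothetical pattern for one class of sensitive data (US SSN-shaped strings).
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact(event: dict) -> dict:
    """Mask sensitive substrings and enrich the event with a category label."""
    out = dict(event)
    msg = out.get("message", "")
    masked = SSN_RE.sub("***-**-****", msg)
    out["message"] = masked
    # Enrichment: a categorization label supports downstream zero trust policy.
    out["data_category"] = "sensitive" if masked != msg else "routine"
    return out

safe = redact({"message": "user ssn 123-45-6789 accessed portal"})
```

Running the transform before events fan out to multiple destinations ensures no destination ever receives the unmasked value.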
-
Tealium + Meta: Inside Tealium’s Unique Integration with Meta Conversions API
By:
Type: Talk
The Tealium + Meta Better Together story isn’t new, but the ways brands are leveraging our partnerships are. With 175 successful customer deployments, Tealium’s integration with Meta’s Conversions API (CAPI) and Event Match Quality (EMQ) is the key to achieving unified client and server-side data. Get an exclusive look at how brands like TUI use Tealium's turnkey partnership with Meta to: - Increase conversion rates across multiple channels - Significantly improve return on ad spend - Reduce acquisition costs
-
Structuring Your Data for Success: Data Management Policies
By:
Type: Video
Discover how robust data management policies are essential for AI adoption and compliance. Explore Information Lifecycle Management (ILM), business continuity, disaster recovery, and how these processes ensure data-driven success.
-
Managing the Lifecycle of Your Software Bills of Materials (SBOMs)
By:
Type: Video
Dmitry Raidman’s foray into SBOM management started with a vulnerable baby monitor when he was a new father in 2015. An SBOM – Software Bill of Materials – is like an ingredient list of all the pieces of code that go into an embedded application, he explains. With hundreds to thousands of SBOMs applied to each software product, he founded Cybeats and built the SBOM Studio. More than just a repository, SBOM Studio automates and orchestrates SBOM management and visualization across a large variety of SBOM types to provide lifetime management of SBOM information and greatly enhance visibility into the software supply chain. He explains the many types of SBOMs, starting with a design SBOM to understand if you’re bringing reputable sources of code into the application before development starts. He points to the repository SBOM, the build SBOM, the binary SBOM, the runtime SBOM and more. An SBOM repository must handle any type of SBOM, he adds, regardless of competing standards. “There are different companies generating different SBOMs, but you want a company that really does it well by actually identifying every single component properly,” he explains. “A quality SBOM can provide analytics around the SBOMs to add value for product builders and customers.” In this show, he also demonstrates how builders and buyers can use SBOM Studio to generate an SBOM from an open-source application using CodeSecure’s CodeSentry binary composition analysis, the result of a partnership between CodeSecure and Cybeats announced in October. In the demonstration, the OS, version, format, license warnings, and other metadata are analyzed against a data lake of known vulnerability and supply chain intelligence data. The findings are then narrowed down to actionable vulnerabilities through application of the Known Exploited Vulnerabilities (KEV) catalog and other information sources to prioritize and analyze “breachability,” as he says.
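The KEV-based narrowing described at the end is essentially a filter-and-rank over SBOM vulnerability findings. The sketch below illustrates that logic; the CVE IDs, CVSS scores, and component names are examples (the real KEV catalog is published by CISA as a JSON feed), and this is not SBOM Studio's implementation.

```python
# Hypothetical slice of a known-exploited-vulnerabilities set.
kev_catalog = {"CVE-2021-44228", "CVE-2023-4863"}

# Hypothetical findings produced by matching SBOM components against
# a vulnerability data lake.
sbom_findings = [
    {"component": "logging-lib", "cve": "CVE-2021-44228", "cvss": 10.0},
    {"component": "image-codec", "cve": "CVE-2020-00000", "cvss": 5.3},
]

def prioritize(findings, kev):
    """Keep only findings with known-exploited CVEs, highest CVSS first."""
    actionable = [f for f in findings if f["cve"] in kev]
    return sorted(actionable, key=lambda f: f["cvss"], reverse=True)

actionable = prioritize(sbom_findings, kev_catalog)
```

Filtering on known exploitation first keeps the analyst queue focused on what is actually "breachable" rather than on every theoretical CVE.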
-
Managing the Lifecycle of Your Software Bills of Materials (SBOMs)
By:
Type: Replay
Dmitry Raidman’s foray into SBOM management started with a vulnerable baby monitor when he was a new father in 2015. An SBOM – Software Bill of Materials – is like an ingredient list of all the pieces of code that go into an embedded application, he explains. With hundreds to thousands of SBOMs applied to each software product, he founded Cybeats and built the SBOM Studio. More than just a repository, SBOM Studio automates and orchestrates SBOM management and visualization across a large variety of SBOM types to provide lifetime management of SBOM information and greatly enhance visibility into the software supply chain. He explains the many types of SBOMs, starting with a design SBOM to understand if you’re bringing reputable sources of code into the application before development starts. He points to the repository SBOM, the build SBOM, the binary SBOM, the runtime SBOM and more. An SBOM repository must handle any type of SBOM, he adds, regardless of competing standards. “There are different companies generating different SBOMs, but you want a company that really does it well by actually identifying every single component properly,” he explains. “A quality SBOM can provide analytics around the SBOMs to add value for product builders and customers.” In this show, he also demonstrates how builders and buyers can use SBOM Studio to generate an SBOM from an open-source application using CodeSecure’s CodeSentry binary composition analysis, the result of a partnership between CodeSecure and Cybeats announced in October. In the demonstration, the OS, version, format, license warnings, and other metadata are analyzed against a data lake of known vulnerability and supply chain intelligence data. The findings are then narrowed down to actionable vulnerabilities through application of the Known Exploited Vulnerabilities (KEV) catalog and other information sources to prioritize and analyze “breachability,” as he says.
-
Streamlining Quality Control Delivery with Venera Technologies
By:
Type: Talk
How, if at all, does Quality Control help you deliver more, better, and faster? QC might sound scary or boring, yet a cloud-based solution can instantly and automatically flag errors, adjust formatting for changing delivery standards, and more. Attend this webinar to learn how to: • Choose a QC strategy that works for your whole team • Use QC tools to flag content errors in even the fastest-paced workflow • Adopt best practices for efficiency in rising and falling workloads
-
The next generation of internal audit: How Meta is innovating with technology | 2024 The MindBridge Conference
By:
Type: Video
Join us for an engaging fireside chat with Badal Patel, Head of Data Analytics and Innovation at Meta, as we delve into how the tech giant is revolutionizing its internal audit processes. This session will explore the strategies and technologies Meta is deploying to transform its audit function into a forward-thinking, value-driven powerhouse. Discover how Meta leverages cutting-edge tools like Artificial Intelligence to boost audit efficiency, generate strategic insights, and maintain robust risk management in a rapidly evolving digital landscape. Gain valuable perspectives on the future of internal auditing and practical takeaways for innovating your own audit practices. Learning Objectives 1) Understand Innovative Technologies in Internal Audit: Gain insights into how Meta is integrating AI and advanced analytics into their internal audit processes to enhance accuracy, efficiency, and risk assessments. 2) Explore Proactive Auditing Strategies: Learn about Meta’s approach to shifting from a reactive to a proactive audit function, including methods to identify and address potential risks before they escalate. 3) Apply Best Practices to Your Organization: Discover actionable strategies and best practices from Meta’s experience that you can apply to innovate and improve your own internal audit processes.
-
From Passage to Standards: The Financial Data Transparency Act of 2022
By:
Type: Talk
The FDTA is here, and time is limited to develop financial system-wide standards! • The FDTA represents the biggest step forward in US financial system standardization in over a decade, and provides an opportunity for financial institutions and regulators alike • Standardization requested by the FDTA will reduce cost and risk while improving efficiency and accountability across every financial sector • Poorly conceived or implemented standards can devolve into a burden and distract from real sources of systemic and institutional risk • Stakeholders should start collaborating now to carefully design standards, set processes, and gain consensus, so that the standards the FDTA calls for will be available and “fit for purpose” within two years OMG proudly provides this briefing and discussion, which will enable your organization to participate in the work of converting the FDTA from a mandate into appropriate standards fit for a future financial services industry and its regulators. Webinar format: • Welcome & Short Intro: David Blaszkowsky (Managing Director, Financial Semantics Collaborative) • Overview of the FDTA: Michelle Savage (VP of Communications, XBRL US) • A Former Regulator's View on How Regulators Will Approach the FDTA Standardization Mandate: Linda Powell (Enterprise Data CDO, BNY Mellon; former CDO of OFR and CFPB and Chief of Economics Data at the Federal Reserve) • What the FDTA & Standardization Mean for Financial Technology Companies: Sanjeev Kumar (CTO, Global Financial Services & Strategy Lead for Data Management, Dell Technologies) • An OMG View of How Standards Are Created and By Whom: Mike Bennett (Technical Director of OMG SDO) OMG works with government, industry, academia and consortia by providing standards development methodologies and governance, including such financial standards as the Financial Industry Business Ontology (FIBO), the Financial Instrument Global Identifier® (FIGI®) and the Standard Business Report Model™ (SBRM™), currently in development.
-
Maximize Your Marketing Performance with Tealium & Meta
By:
Type: Video
Hear from Todd Pasternack, Head of Ads Partnerships, Americas at Meta and Tealium’s Jenna Fair, Senior Director, Global Technology Partnerships as they share key strategies for how the most innovative businesses are leveraging the power of Tealium + Meta to unify client and server-side data to increase conversion rates, reduce acquisition costs and optimize best practices.
-
Maximize Your Marketing Performance with Tealium & Meta
By:
Type: Replay
Hear from Todd Pasternack, Head of Ads Partnerships, Americas at Meta and Tealium’s Jenna Fair, Senior Director, Global Technology Partnerships as they share key strategies for how the most innovative businesses are leveraging the power of Tealium + Meta to unify client and server-side data to increase conversion rates, reduce acquisition costs and optimize best practices.
-
The Five Dimensions of Content Standardization
By:
Type: Talk
Come join global content strategy expert Val Swisher, CEO of Content Rules, as she introduces the Five Dimensions of Content Standardization™. Learn about content standards — what they are, and why they're essential to delivering exceptional content experiences. You'll learn how Content Rules' content standardization framework makes content FAIR (findable, accessible, interoperable, and reusable). And, you'll leave knowing the steps you can take to prepare your content for the future. When you document and enforce standards across all Five Dimensions, your content becomes seamlessly reusable. You also reduce risk, time-to-market, and cost while increasing content quality in all languages. The Five Dimensions framework enables you to: 1. Automate content generation, assembly, formatting, and publishing 2. Reuse content in any output type or file format 3. Deploy artificial intelligence to gain actionable insights 4. Exchange information directly with partners, CROs, sites, and regulatory agencies without locking it into documents first 5. Increase translation quality while significantly reducing cost and turnaround time About the presenter: Val Swisher is the founder of Content Rules, where she leads a cross-disciplinary team of content experts and has predicted several important content trends. She runs the company's content strategy, global content, and content optimization services. Val has more than two decades of experience and is a well-known expert on global readiness, intelligent content, and technology solutions. She frequently speaks at industry conferences and is a sought-after guest on webinars and podcasts. She believes content should be easy to read, cost-effective to translate, and efficient to manage. Val is the author of several books, including The Personalization Paradox (XML Press) and Global Content Strategy: A Primer (XML Press).
-
Building Network Automations: How to Get Started Step-by-Step
By:
Type: Talk
Getting started with automating network infrastructure requires a logical, step-by-step approach. You should start simple with a relevant use case and translate the process into a series of logical tasks. Then, you can build out integrations and surrounding processes to ensure it works with your infrastructure and meets your standards. This webinar demonstrates how to translate this process into practice. In the second part of this demo series on how to build automated workflows with Itential, Rich Martin, Director of Technical Marketing at Itential, will show you step-by-step how to: • Build pre-check and post-check tasks into a network automation. • Parse and evaluate result data from previous tasks. • Integrate with IT systems to access data for the automation (IPAM). • Format and translate data between tasks using Data Transformations. • Run and test the workflow as part of the building process.
-
Hitachi HCP Removes Data Silos
By:
Type: Video
Accident Exchange, part of the AIS group, is a provider of mobility services for people who have experienced a car accident. With large volumes of data in various formats to maintain, the company recognized the potential compliance challenges ahead. Accident Exchange, in partnership with Hitachi Data Systems, implemented Hitachi Content Platform (HCP), giving them the capability to more securely store and abstract data. Learn how Accident Exchange can now prove authenticity, automate, and abstract data from different sources in multiple files and formats, all within one platform, to achieve greater compliance. HCP is the key to removing data silos.
-
AV1 and streaming codecs
By:
Type: Talk
In the four years since its initial release, AV1 has become an industry buzzword. Conceived by the Alliance for Open Media as a successor to VP9, AV1 is a royalty-free video coding format geared towards streaming and other internet applications. Benefiting from very good efficiency of compression and bandwidth usage, AV1 has achieved traction with some world-leading streamers – including Netflix and YouTube – although adoption elsewhere has been more mixed. In this webinar we will look at the benefits of AV1 and its prospects in media streaming and elsewhere, with discussion of aspects including: - Efficiency and performance improvements when compared to earlier formats, including VP9 and HEVC - Ability of AV1 to send streams cheaper and faster, and the impact this is having on the capacity of streaming services - Limitations of AV1 and how these might be addressed in future iterations of the format - Notable commercial implementations of AV1, and the general outlook for the format in streaming worldwide - The features that are likely to be required of future coding formats for streaming.
-
How can data & analytics help IROs prioritize ESG reporting & other disclosures?
By:
Type: Talk
The demand for ESG disclosures from companies has grown exponentially in recent years – as has interest from investors and other stakeholders in more granular data around the impact an organization has on the wider world around it. As regulators move toward standardizing sustainability-related disclosure standards with the formation of the International Sustainability Standards Board, the requirements of IROs and ESG teams in preparing data and disclosures will only become more demanding. In this webinar, hear from leading IROs and corporate advisers about how best to use your company’s data to communicate its ESG strengths and tell the stories at its core, reaching all stakeholders in the way that best suits them. In this webinar, you will hear: • How ESG disclosures are becoming more standardized and aligned • How to use data to communicate your ESG strengths and tell your best stories • Best practices for refining your approach to ESG reporting and disclosures for companies at every stage of familiarity • How tracking and analyzing your sector and industry peers can help to further improve your own approach to disclosure • Advice for how to balance your resources against the growing demands of market participants. IR Magazine has partnered with Broadridge to deliver you this webinar.
-
How to unlock the true value of data
By: TechTarget ComputerWeekly.com
Type: eGuide
With a robust data architecture in place, a firm's data science team can turn raw data into business insight. We take a quick look at how this can be done.
-
Data Preparation Essentials for Automated Machine Learning
By:
Type: Video
To run successful machine learning projects and create highly accurate predictive models for your business, you need effective data preparation. Although machine learning automation provides safeguards to prevent common mistakes, you’ll still want to correctly prepare, shape and format your data to generate optimal models. In this on-demand webinar, Jen Underwood, Founder of Impact Analytix, reviews how to organize data in a machine-learning-friendly format that accurately reflects the business process and outcomes. She shares basic guidelines, practical tips, and additional resources to help you get started mastering the essentials of predictive model data preparation.
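The shaping and formatting steps described above can be sketched in a few lines. This is an illustrative example, not material from the webinar: the column names, sample records, and the choice of min-max scaling and one-hot encoding are assumptions made for the sketch.

```python
# Illustrative data-preparation sketch: turn raw records into flat,
# numeric, ML-friendly columns using only the standard library.

def min_max_scale(values):
    """Rescale numeric values to the [0, 1] range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for constant columns
    return [(v - lo) / span for v in values]

def one_hot(values):
    """Expand a categorical column into 0/1 indicator columns."""
    categories = sorted(set(values))
    return {c: [1 if v == c else 0 for v in values] for c in categories}

# Hypothetical raw business records
records = [
    {"region": "east", "revenue": 120.0},
    {"region": "west", "revenue": 80.0},
    {"region": "east", "revenue": 100.0},
]

revenue_scaled = min_max_scale([r["revenue"] for r in records])
region_cols = one_hot([r["region"] for r in records])

print(revenue_scaled)       # [1.0, 0.0, 0.5]
print(region_cols["east"])  # [1, 0, 1]
```

Real projects would use a dedicated library for this, but the two transforms above capture the essence of making data "machine-learning-friendly": everything numeric, one row per observation.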
-
Overcome the three big Bs in data management with data-fitness
By: Hitachi Pentaho
Type: White Paper
In this white paper, you'll discover how to become "data-fit" and overcome the challenges of data management. Read on now to learn about the three Bs - bottlenecks, boredom, and business requests - and how to address them with a modular, scalable data platform.
-
Measure your data-fitness with a modular data platform
By: Hitachi Pentaho
Type: Infographic
In this infographic, you'll learn how to unlock the power of your data with Pentaho Data Optimizer. You'll also discover how to automate data ingestion, curate any format, and scale seamlessly - all while maintaining data governance and compliance. Read on now to find out how your organization can become data-fit.
-
How Meta Drives Business Outcomes Through Certifications & Training
By:
Type: Talk
Discover how Meta has transformed its certification and training programs to drive substantial business outcomes. Starting from scratch, Vikas and his team developed a comprehensive approach that integrates job role-based and domain-specific certifications with flexible training options. Learn how Meta aligns its educational initiatives with strategic business goals and measures success through key metrics. In this session, you'll gain insights into Meta's journey, best practices, and the critical role of certifications and training in achieving business impact. Key Takeaways:
- Discover how metric measurement evolves as your program matures
- Learn how to design education content with business impact in mind
- Gain insights into the continuous improvement process based on data-driven feedback and performance against business outcomes
-
Automating Quarterly Reviews at Algolia with dbt Cloud, Redshift, and Hex
By:
Type: Video
Learn how to connect your data from Hex to Google Sheets and Google Slides so that stakeholders have up-to-date information for the KPIs they care about, in the format they are comfortable with. This event was live on August 10, 2023.
-
Formatting charts in PowerPoint Brand Genie
By:
Type: Video
Formatting charts in PowerPoint Brand Genie
-
Formatting tables in PowerPoint Brand Genie
By:
Type: Video
Formatting tables in PowerPoint Brand Genie
-
2nd PMI Mentoring and Development Programme - Sponsored by the People's Pension
By:
Type: Talk
Webinar to highlight the requirements and structure of the 2nd PMI Mentoring and Development Programme - Sponsored by The People's Pension and in conjunction with the ILM
-
How to develop a data-driven content strategy
By: BrightTALK by TechTarget
Type: Webcast
In the second episode of the Content Intelligence Series, Nick Markwith will explore how to start developing a data-driven content strategy that produces results from an analytical perspective.
-
Best practices for enabling industrial DataOps
By: HighByte
Type: White Paper
Industrial DataOps is the dominant framework for mastering Industry 4.0 data transformation projects, and it is key for leveraging solutions that can deliver data to users for a real-time view of the enterprise. Read on to learn about a solution that can help manage data in a common format that is ready to consume, contextualize, and scale for the customer.
-
The benefits of switching from worksheets to workflows
By: Alteryx
Type: eBook
If you're working in spreadsheets to do analysis, 90% of your workday is taken up by menial tasks you'd rather not be doing. Formatting data sources. Cleaning and parsing. Applying formulas.
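The "cleaning and parsing" chores named above are exactly what a workflow automates. As a minimal sketch (the column names and sample rows are invented for illustration, not taken from the eBook), here is the kind of repetitive tidy-up that would otherwise be done by hand in a spreadsheet:

```python
# Minimal cleaning-and-parsing sketch using only the stdlib csv module:
# trim stray whitespace and convert text fields into typed values.
import csv
import io

# Hypothetical messy export, as it might arrive from a spreadsheet
raw = io.StringIO(
    "name, amount\n"
    "  Alice ,1200\n"
    "Bob, 950 \n"
)

rows = []
for row in csv.DictReader(raw, skipinitialspace=True):
    rows.append({
        "name": row["name"].strip(),           # trim stray whitespace
        "amount": int(row["amount"].strip()),  # parse text into a number
    })

print(rows)
```

Once steps like these live in a repeatable workflow rather than in manual spreadsheet edits, they run the same way on every new data source.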
-
Focus: PCIe SSD, NVMe and flash
By: TechTarget ComputerWeekly.com
Type: eGuide
PCIe SSD cards fit straight into PCIe slots in servers and array hardware and often bring much higher performance than traditional HDD-format flash drives.
-
#IMOS22 Securing the Metaverse! The Cybersecurity Industry’s Call to Action
By:
Type: Talk
It’s official - the metaverse is coming. The company formerly known as Facebook is so committed to the idea that it’s renamed itself “Meta” to symbolize its dive into this new online world. Worryingly, despite the exciting possibilities the metaverse offers, there are substantial concerns around safety, privacy and security. Could the metaverse provide forums for misinformation and manipulation? How will companies such as Meta handle sensitive information? Join this panel discussion in which cybersecurity experts explore the challenges around the metaverse, from verification and the dangers of impersonation, to biometric information and how private data should be collected and used.
-
ISO 20022 Migration Webcast
By:
Type: Talk
This past year has been filled with learnings, challenges and accomplishments related to the ISO 20022 migration. The Treasury Services team can help ease your transition by sharing the latest updates and best practices as the U.S. market prepares for a co-existence of CHIPS ISO 20022 message formats and Fedwire legacy formats until March 10, 2025.
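To make the format difference concrete: ISO 20022 messages are structured XML, whereas the legacy formats are field-tag or fixed-position based. The sketch below builds and reads a drastically simplified fragment using only the standard library; the element names follow the ISO 20022 pain.001 credit-transfer style, but the message id is hypothetical and the fragment is not a schema-valid message.

```python
# Simplified, illustrative ISO 20022-style XML fragment (not schema-valid).
import xml.etree.ElementTree as ET

doc = ET.Element("Document")
hdr = ET.SubElement(ET.SubElement(doc, "CstmrCdtTrfInitn"), "GrpHdr")
ET.SubElement(hdr, "MsgId").text = "MSG-0001"  # hypothetical message id
ET.SubElement(hdr, "NbOfTxs").text = "1"

xml_text = ET.tostring(doc, encoding="unicode")
print(xml_text)

# Unlike legacy formats, fields are addressed by named path, not by
# position or tag number:
msg_id = ET.fromstring(xml_text).findtext("CstmrCdtTrfInitn/GrpHdr/MsgId")
print(msg_id)  # MSG-0001
```

During a co-existence period like the one described above, translation between the two representations is the central engineering task.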
-
How to paste and format tables
By:
Type: Video
How to paste and format tables using the Word Brand Genie ribbon.
-
3 ways to fast-track your data lake strategy without being a data expert
By:
Type: Video
IT and security teams are drowning in data. Storing it all to meet compliance regulations and for future analysis can be a challenge, especially with limited resources and tight budget constraints. Data lakes often become a data swamp: a mere dumping ground for data, all in messy, disparate formats. How can teams get value out of their data once it's stored? Join this webinar to learn how to get a data lake up and running quickly, without the burden of complex setup and management. A managed data lake can help to:
- Easily get data in, and get data out
- Embrace open formats to ensure long-term data retention while avoiding vendor lock-in
- Keep data secure, prevent unauthorized access, and encourage data sharing
And the best part? You don't have to be a data or cloud expert. Ready to work smarter, not harder with your data lake? Tune in today.
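A common first step against the "messy, disparate formats" problem is normalizing on ingest. As a minimal sketch (the three input formats are assumptions chosen for the example, not taken from the webinar), timestamps arriving in different shapes can be converted to a single open, sortable representation such as ISO 8601:

```python
# Normalize timestamps from disparate source formats to UTC ISO 8601.
from datetime import datetime, timezone

KNOWN_FORMATS = [
    "%Y-%m-%dT%H:%M:%S%z",   # ISO-like with UTC offset
    "%d/%b/%Y:%H:%M:%S %z",  # web-server access-log style
    "%b %d %Y %H:%M:%S",     # syslog-ish, assumed to be UTC
]

def normalize(ts):
    """Parse a timestamp in any known format and emit UTC ISO 8601."""
    for fmt in KNOWN_FORMATS:
        try:
            dt = datetime.strptime(ts, fmt)
        except ValueError:
            continue  # try the next known format
        if dt.tzinfo is None:
            dt = dt.replace(tzinfo=timezone.utc)
        return dt.astimezone(timezone.utc).isoformat()
    raise ValueError(f"unrecognized timestamp: {ts!r}")

print(normalize("10/Oct/2023:13:55:36 +0000"))  # 2023-10-10T13:55:36+00:00
print(normalize("Oct 10 2023 13:55:36"))        # 2023-10-10T13:55:36+00:00
```

Applying this kind of normalization as data lands is what keeps a lake queryable instead of letting it turn into the swamp described above.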