In this regular column, we’ll bring you all the latest industry news centered on our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.
Indico Data Unveils Indico 5, Major Release of its Unstructured Data Platform
Indico Data, the unstructured data company, unveiled Indico 5, a major release of its AI-powered Unstructured Data Platform. Indico 5 addresses the rapidly growing market demand for software solutions that drive efficiency and accelerate automation and intelligent document processing (IDP) initiatives using unstructured data.
“Indico 5 is another major advance in our strategy of putting game-changing AI solutions for unstructured data in the hands of business users,” said Tom Wilde, CEO of Indico Data. “The real promise of automation in Indico 5 is using AI to augment human expertise, not replace it. The rapid evolution of workforce environments, where remote and hybrid working have shifted employee experience expectations, is also forcing businesses to rethink investments that improve accessibility and use of enterprise data, to increase productivity. We’re delivering that exceptional value to our customers.”
PlanetScale Rewind Is the Database Industry’s First “Easy Button” to Quickly Undo Bad Schema Migrations With Zero Data Loss
PlanetScale, the serverless database powered by MySQL and Vitess, announced PlanetScale Rewind, an “Easy Button” to undo schema migrations that enables users to recover in seconds from changes that break production databases. PlanetScale Rewind lets users almost instantly revert changes to the previous healthy state without losing any of the data that was added, modified, or otherwise changed in the interim.
“PlanetScale Rewind gives users the ability to undo a bad migration just as easily as you undo a typo,” said PlanetScale Vice President of Engineering Nick Van Wiggeren. “So, instead of having to go restart the entire process, imagine if you could just click a button – like Control-Z – and get it back to where it was when you started but, notably, without losing a byte of data. This has never been done before, in any context, in any language, with any other database in the world.”
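One way to picture how a revert can preserve interim writes is to dual-write during the migration window, so that "rewind" is simply switching reads back to the old version. The sketch below is a conceptual illustration of that idea only, with hypothetical names; it is not PlanetScale's actual implementation.

```python
# Conceptual sketch: during a schema migration, writes land in both the
# old and new table versions, so reverting is an instant pointer switch
# that loses none of the data written in the interim. Illustrative only;
# not PlanetScale's implementation.
class MigratingTable:
    def __init__(self, rows):
        self.old = {k: dict(v) for k, v in rows.items()}                # original schema
        self.new = {k: {**v, "new_col": None} for k, v in rows.items()}  # migrated schema
        self.serving = "new"  # traffic cut over to the migrated version

    def write(self, key, row):
        # Dual-write keeps both versions complete during the window.
        self.old[key] = {c: v for c, v in row.items() if c != "new_col"}
        self.new[key] = dict(row)

    def rewind(self):
        # Instant revert: reads go back to the old schema, and interim
        # writes are already there thanks to the dual-write.
        self.serving = "old"

    def read(self, key):
        return (self.new if self.serving == "new" else self.old)[key]

table = MigratingTable({"1": {"name": "a"}})
table.write("2", {"name": "b", "new_col": 5})  # written after cutover
table.rewind()
print(table.read("2"))  # interim write survives the revert
```

The key design point is that the revert is a metadata change, not a data restore, which is why it can complete in seconds.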
Kyligence Launches Managed Services Offering for its Cloud Data Analytics Solutions
Kyligence, originator of Apache Kylin and developer of the AI-augmented data services and management platform Kyligence Cloud, announced a new remote managed services offering. Kyligence Managed Services provides enterprise organizations with automated operation and maintenance tools, consultancy from experienced product experts, a 99.9 percent SLA guarantee, and 24/7/365 support.
“Enterprises need a fast pace of technological innovation to respond to the agile development of digital transformation initiatives; however, big data solutions are very diverse and the learning threshold is generally high,” said Silas Ge, vice president of customer success, Kyligence. “The right IT talent can be difficult to find and very costly. Kyligence Managed Services helps enterprises lower operational labor costs while providing a higher level of service.”
YugabyteDB 2.13 Delivers Breakthrough Developer Experience and Performance Improvements
Yugabyte, a leading open source distributed SQL database company, announced the general availability of YugabyteDB 2.13. The latest version of YugabyteDB delivers better control over where data is stored and accessed in geo-distributed deployments, allowing enterprises to lower data transfer costs, improve performance, and ensure compliance with regulatory requirements. The release also includes new features and integrations to enhance the developer experience.
YugabyteDB offers developers, architects, and database operators a rich set of deployment and replication options in geo-distributed environments, allowing them to build enterprise-ready applications that match their business needs and locations. YugabyteDB 2.13 extends the geo-distribution capabilities of the database with new features that enhance performance, increase control over backups, and intelligently utilize local data for reads.
“For enterprises to deliver on the promise of providing high-quality consumer-facing applications, it’s more important than ever to enhance the developer experience so they can be productive and have more time to focus on true, value-add features,” said Karthik Ranganathan, co-founder and CTO, Yugabyte. “Our mission is to deliver the most developer-friendly database. These latest updates will help simplify coding by offloading and automating key functions at the data layer and improve the developer experience by delivering easy, interactive training and greater access to preferred developer tools.”
Talend Data Catalog 8 Delivers Smart, Fast, and Flexible Data Compliance
Talend, a global leader in data integration and data governance, announced the availability of Talend Data Catalog 8, an automated data catalog providing world-class proactive data governance capabilities that enable organizations to discover, organize, and share trusted data with a single, secure point of control. Talend adds new capabilities with this update, including tailored business modeling and machine learning-powered data classification and documentation. The latest release enables organizations of any size to persistently govern data at scale and ensure the health of the data being used to drive business outcomes.
“Reliance on data has grown exponentially in recent years and businesses today need greater control and trust of their data to ensure it is healthy for decision making and operations,” said Daniel Mayer, Vice President, Product Management, Talend. “We’re excited to announce the availability of our newest Data Catalog release, which will help companies implement a proactive approach to governance, to accelerate productivity and help them quickly achieve value from their data.”
StarTree Expands Deployment Options for Apache Pinot Platform with General Availability of Bring Your Own Cloud and Preview of SaaS Edition
StarTree, Inc. announced the general availability of the Bring Your Own Cloud (BYOC) edition and preview of the SaaS edition of StarTree Cloud – a fully managed Apache Pinot platform. Apache Pinot has been proven at scale by LinkedIn, Stripe, Uber, Walmart, DoorDash, WePay and many other companies. StarTree Cloud enables enterprises to ingest, store (with decoupled storage and compute), and query large volumes of data for their user-facing and real-time analytical applications that require millisecond response times and high throughput. StarTree Cloud powers customers in retail, media, food delivery, fintech, and several other industries. Pluto TV, Guitar Center, and Just Eat Takeaway.com are some of the customers using StarTree Cloud.
“Real-time analytics – once a ‘nice to have’ – has become critical for organizations and their external stakeholders. Companies have a remarkable opportunity to provide insights and reinvent user experiences,” said Rohit Agarwalla, Head of Product at StarTree. “With the BYOC and SaaS editions, StarTree Cloud is a multi-cloud ready solution that gives customers the option to pick a fully-managed Apache Pinot deployment model based on where they are in their data journey, allowing teams to focus on building and scaling interactive real-time analytics applications.”
Domo Releases Key Platform Enhancements that Help Put Data to Work for Everyone
Domo (Nasdaq: DOMO) announced several key enhancements to its cloud-based platform that make it easier than ever to put data to work for everyone, with the speed, scale and user experience (UX) needed to tackle any business challenge through data apps. Included in these platform enhancements are new functionality in Domo Multi-cloud, a new Governance Toolkit and updates to Universal Data Modeling, making it easier to understand, engage and leverage data across any organization.
“It is time for organizations to move beyond thinking of data as charts and graphs and towards adopting customized intelligent apps that not only deliver insights but drive action and support the needs of workers right where the work gets done,” said John Mellor, Domo CEO. “Our focus with Data Apps is supporting the white spaces in organizations where traditional BI and enterprise software applications like CRM and ERP have traditionally not reached. We’re making it easy for customers to put data to work for everyone by leveraging Domo as a low-code data app platform to build apps and improve business processes and outcomes everywhere work gets done.”
GoodData Launches Dashboard Plugins to Enable Seamless Customer Customization of Analytics Dashboards
GoodData™, a leader in data and analytics infrastructure, announced the release of dashboard plugins that enable enhanced capabilities for dashboard customization without the need for GoodData support. With the creation of dashboard plugins, GoodData customers are able to tailor their default dashboard experience with custom code based on the low-code GoodData.UI SDK for custom application development. They can modify existing visualizations, create new visuals, and build custom navigation and logic within the dashboard, or even integrate third-party content that interacts with the dashboard. Widely requested by customers, these plugins empower front-end developers to easily tailor existing GoodData dashboards without the need to build a front-end application from scratch. This is another step in the company’s journey toward widespread data literacy and accessibility.
“With the explosion of the data analytics industry, the importance for data insights to be readable and adaptable by anyone at an organization is insurmountable if you wish to grow a successful business. By adding capabilities such as these dashboard plugins, we are prioritizing developer experience and allowing our clients to experience a simpler-than-ever analytics experience,” said GoodData Founder and CEO Roman Stanek. “This level of accessibility builds upon our low-code/no-code mindset by allowing anyone at a company to tap into the organization’s data assets, services, and integrations — or data fabric — and compose new data-driven applications and solutions.”
Icertis Releases AI Studio to Democratize Use of AI-Powered Contract Intelligence
Icertis, the contract intelligence company that pushes the boundaries of what’s possible with Contract Lifecycle Management (CLM), released AI Studio, a self-serve, self-learning cognitive tool. AI Studio utilizes artificial intelligence (AI) models to analyze large sets of documents and generate contract intelligence for real-time insights and decision making. AI Studio’s availability coincides with the latest release of the Icertis Contract Intelligence (ICI) platform, which includes a streamlined, highly adaptable, and efficient user experience called ACE UX. As contract data becomes the new data pool in the enterprise, businesses should not only glean intelligence from their contracts but connect that data with systems across the enterprise to uncover growth opportunities and identify potential risks.
“CLM is emerging as the fifth system of record, unleashing a new pool of highly valuable enterprise data and driving demand for new AI tools that ensure the intent of every contract is fully realized—but this power needs to be in the hands of users, not just data scientists,” explained Monish Darda, CTO and Co-founder of Icertis.
HPE GreenLake Edge-to-Cloud Platform Delivers Greater Choice and Simplicity with Unified Experience, New Cloud Services, and Expanded Partner Ecosystem
Hewlett Packard Enterprise (NYSE: HPE) announced significant advancements to HPE GreenLake, the company’s flagship offering that enables organizations to modernize all their applications and data, from edge to cloud. Now, HPE’s market-leading hybrid cloud platform just got stronger, with a unified operating experience, new cloud services, and availability of HPE GreenLake in the online marketplaces of several leading distributors.
“HPE was among the first to deliver a cloud platform that enables customers to manage and extract insights from their data from edge to cloud, and our continued innovation is driving growth and furthering our market leadership,” said Antonio Neri, president and CEO, HPE. “In the hybrid cloud market, HPE GreenLake is unique in its simplicity, unification, depth of cloud services, and partner network. Today, we are furthering our differentiation, boldly setting HPE GreenLake even further apart as the ideal platform for customers to drive data-first modernization.”
New DDN Storage Appliance Doubles Performance for NVIDIA DGX AI Solutions and Speeds Up Analytics and Machine Learning in the Cloud by 100%
DDN®, a leader in artificial intelligence (AI) and multicloud data management solutions, announced its next-generation flash and hybrid data platforms for NVIDIA DGX POD™ and DGX SuperPOD™ AI, analytics and deep learning computing infrastructure. Powering thousands of NVIDIA DGX™ systems, including NVIDIA’s Selene and Cambridge-1 DGX SuperPOD systems, DDN offers a broad range of optimized AI data storage solutions for applications such as autonomous vehicles, natural language processing, financial modeling, drug discovery, academic research, and government security.
The DDN A3I® AI400X2 system delivers real-world performance of more than 90 GB/s and 3 million IOPS to an NVIDIA DGX A100 system. Available with 250TB and 500TB all-NVMe usable capacity, and with the ability to scale orders of magnitude more, the DDN AI400X2 is the highest-performing and most efficient building block for AI infrastructures.
“DDN has been a market leader in AI, analytics and machine learning for many years and our collaboration with NVIDIA is leading the industry in performance, efficiency and ease of management at any scale,” said Dr. James Coomer, vice president of products, DDN. “With our next-generation flash and hybrid DDN AI400X2 storage systems, we are effectively doubling performance, improving ease of use and greatly expanding support for all AI users globally.”
Sequitur Labs Releases Turnkey Solution to Simplify Protection of Edge AI Models Powered by the NVIDIA Jetson Platform
Sequitur Labs announced a new release of its EmSPARK™ Security Suite for the NVIDIA Jetson™ edge AI platform, featuring a generally available development kit and pre-built Trusted Applications that provide robust security features and functions needed to protect AI models at the edge. Expanding Sequitur’s support for the NVIDIA Jetson platform, the latest release features new Trusted Applications supporting tools for protection of AI models on edge devices, and a turnkey evaluation image that can be implemented on the Jetson platform for development and integration. This new release works in concert with the NVIDIA JetPack™ SDK, including support for the latest version (JetPack 4.6).
Sequitur Labs’ EmSPARK Security Suite is designed to address solutions in industries where embedded security is paramount, in particular, protection of AI models at the edge. Supporting security functions for encryption, storage, data transmission and key/certificate management are delivered by EmSPARK and housed in a microprocessor’s secure memory partition. IoT hardware manufacturers use EmSPARK to easily implement device-level security by addressing technical, IP, supply chain and business process challenges. Developers can build applications that use secure resources without having to become experts in cryptography and complex chip-level security technologies.
“Building a release of EmSPARK for the NVIDIA Jetson platform is part of our commitment to providing a complete array of tools for protecting AI models on edge devices,” said Philip Attfield, co-founder and CEO of Sequitur Labs. “With this release, you can access the latest test software with all of the documentation and tools you need. It’s a turnkey solution and it’s easier than ever to quickly integrate tools for securing devices and protecting AI models on the Jetson platform.”
JamLabs Announces Innovative Online Data Selling Platform
JamLabs Data Science (JamLabs) is launching jShop, a powerful, innovative cloud-based product management platform for companies with monetizable data. With jShop, clients will be able to monetize their valued data by launching online white-labeled stores. jShop activates data products across all data marketplaces and puts the power of data product management into the hands of data suppliers.
“jShop’s adapters allow customers to quickly load and publish their catalog from major platforms like AWS, Snowflake, and Databricks, reducing onboarding time from weeks to days,” said Charlie Frantowski, JamLabs Chief Product Officer. “jShop lets our clients create licenses that bill per gigabyte accessed or charge based on slices selected by customers.”
Rivery Launches ETL Python Integration
Rivery, a leading data management company, announced the integration of Python as a native source or target for any data workflow, empowering data analysts and engineers with fully customizable and complex data workflows. This industry-first solution, available to all Rivery customers, means that Python DataFrames can now be used as a native source or target in their ETL/ELT workflows, eliminating the need for writing any “plumbing” code in Python.
In the past, organizations looking to build an end-to-end data stack had to choose between ease of use in a “no-code” platform, or complex solutions built for engineers that support advanced use cases and customization. While Rivery’s no-code solution addressed most of the challenges data analysts and BI teams face, adding Python solves advanced needs including custom connectivity, complex transformations, AI/machine learning, and data enrichment.
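To make the DataFrame-as-a-workflow-step idea concrete, here is a minimal sketch of the kind of transformation a platform like Rivery could hand off to user code: the platform extracts rows into a DataFrame, a user-supplied function transforms it, and the returned DataFrame is loaded onward. The function name and schema are hypothetical illustrations, not Rivery's actual API.

```python
import pandas as pd

# Hypothetical transformation step in an ETL/ELT workflow. In a platform
# like Rivery, the surrounding "plumbing" (extraction and loading) would
# be handled for you; user code only sees DataFrame in, DataFrame out.
def enrich_orders(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["total"] = out["quantity"] * out["unit_price"]          # derived column
    out["tier"] = out["total"].apply(                           # simple enrichment
        lambda t: "high" if t >= 100 else "standard"
    )
    return out

# Rows the platform would extract from a source system.
extracted = pd.DataFrame(
    {"order_id": [1, 2], "quantity": [10, 2], "unit_price": [12.5, 8.0]}
)
loaded = enrich_orders(extracted)  # result handed back for loading
print(loaded[["order_id", "total", "tier"]])
```

The same pattern extends naturally to the advanced cases mentioned above, such as calling a machine learning model inside the transformation function.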
Aviv Noy, CTO of Rivery, said: “Python frees data engineers to customize their data workflows as far as their imagination can go. The addition of Python integration empowers Rivery users with full customization and data enrichment with machine learning and complex transformations.”
Super.AI Advances its Unstructured Data Processing Platform with Features Tailored for Shared Services
Super.AI announced the latest version of the company’s Unstructured Data Processing (UDP) Platform, making it easier for global business services and IT departments to expand the scope and pace of intelligent automation. Today, shared services centers must deploy multiple point solutions for document processing, sensitive information redaction, and processing other forms of unstructured data such as emails, text, images, video, and audio. Super.AI’s UDP Platform unifies intelligent document processing (IDP), human-in-the-loop (HITL), redaction, and processing of any data type—reducing the number of platforms needed for intelligent automation.
“To process unstructured data businesses have turned to different solutions for documents, emails, and sensitive information redaction,” said Brad Cordova, super.AI founder and CEO. “We have built a unified, modern platform for processing unstructured data and paired it with a rich marketplace of pre-configured AI applications. The goal is to help business users quickly turn AI models into AI applications that deliver a high ROI and offer guaranteed quality for unstructured data processing.”
Nyriad Announces the All-New UltraIO Storage System, Unleashing the Power of GPUs to Revolutionize Storage
Nyriad, Inc. announced the all-new UltraIO™ storage system, using the processing power of GPUs and advanced algorithms to deliver unprecedented performance, resiliency, and efficiency with low total cost of ownership. The UltraIO system supports block, file, and object data types in a single system, giving organizations the flexibility to consolidate storage and extend the system quickly and easily as needs dictate. In addition, the UltraIO system runs on industry-standard hardware, ensuring the system’s capabilities will improve as technologies become available. Combined with simplified management, these attributes let organizations deploy, manage, and nondisruptively scale the UltraIO system.
“Nyriad is dedicated to solving the biggest pain points of storage, many of which result from the limitations of RAID,” said Derek Dicker, Chief Executive Officer of Nyriad. “RAID has been the de facto standard for storage for more than three decades, but the demands of today’s data-intensive applications now far exceed its capabilities. With the UltraIO system, Nyriad built a new foundation for storage that addresses current problems and creates new opportunities for the future—without requiring organizations to make disruptive changes to their storage infrastructure.”
Getty Images Launches Model Release Supporting Data Privacy in Artificial Intelligence and Machine Learning
Getty Images, a preeminent global visual content creator and marketplace, announced the introduction of an industry first Enhanced Model Release form, recognizing advancements in data privacy and security and the growing importance of biometric data for the training of Artificial Intelligence (AI) and Machine Learning (ML) applications.
Developed with input from the Digital Media Licensing Association (DMLA), a leading body supporting business standards in visual content, the new form will provide clarity and guidance as to how data, including visual content, can be tracked and handled appropriately to protect the personal and biometric data captured by its content creators. Getty Images hopes the form will be widely adopted and signed by models who feature in new commercial images and video on the Getty Images and iStock websites.
Biometric data is especially valuable because it can be used to recognize and map facial features extracted from visual content. Recently, there has been a spate of lawsuits around the use of biometric information without the explicit consent of people featured in visual imagery. While the law in this area is still evolving, developers should always start with collecting data from legitimate sources and obtaining authorization for its intended use.
“As AI and ML technologies evolve the visual content landscape, Getty Images remains committed to protecting the intellectual property rights of the content creator community as well as respecting the privacy and property rights of third parties,” said Paul Reinitz, Director of Advocacy and Legal Operations Counsel at Getty Images. “Although the potential applications of AI and ML are limitless, it is important to recognize that new tools and applications require us to rethink the interaction between technology and creative processes.”
Domino Data Lab Extends Enterprise MLOps to the Edge with New NVIDIA Fleet Command Support
Domino Data Lab, provider of a leading Enterprise MLOps platform trusted by over 20% of the Fortune 100, announced new integrations with NVIDIA that extend fast and flexible deployment of GPU-accelerated machine learning models across modern tech stacks – from data centers to dash cams.
Domino is the first MLOps platform integrated with NVIDIA Fleet Command™, enabling seamless deployment of models across edge devices, in addition to Domino’s recent qualifications for the NVIDIA AI Enterprise software suite. New curated MLOps trial availability through NVIDIA LaunchPad fast-tracks AI projects from prototype to production, while new support for on-demand Message Passing Interface (MPI) clusters and NVIDIA NGC™ streamline access to GPU-accelerated tooling and infrastructure, furthering Domino’s market-leading openness.
“Streamlined deployment and management of GPU-accelerated models bring a true competitive advantage,” said Thomas Robinson, VP of Strategic Partnerships & Corporate Development at Domino. “We led the charge as the first Enterprise MLOps platform to integrate with NVIDIA AI Enterprise, NVIDIA Fleet Command, and NVIDIA LaunchPad. We are excited to help more customers develop innovative use cases to solve the world’s most important challenges.”
Rescale Bolsters Support for Containerization of HPC and AI/ML Workloads in the Cloud
Rescale, a leader in high performance computing built for the cloud to accelerate engineering innovation, announced broader and deeper support to run containers on any cloud and any specialized architecture, enabling companies to deploy their custom and open source science and engineering applications with greater simplicity and cost-performance on any hybrid cloud architecture.
“Organizations powering computational science and engineering with HPC can face daunting challenges of complexity,” said Adam McKenzie, CTO and Founder at Rescale. “Containers on Rescale help to extend the simplicity and full-stack cloud automation we bring our customers into containerized workloads. The rapid growth of AI/ML, open-source, and custom applications is driving a convergence with HPC. Rescale provides a bridge for these technologies to not just run applications, but supercharge them with the latest architectures whether you care about optimizing cost or speed while at the same time giving IT the security and control they require.”
Rafay Systems Powers AI and Machine Learning Applications at the Edge by Streamlining Operations for GPU-based Container Workloads
Rafay Systems, a leading platform provider for Kubernetes Operations, announced the expansion of the industry’s only turnkey solution for operating Kubernetes clusters with GPU support at scale by adding powerful new metrics and dashboards for deeper visibility into GPU health and performance.
The Rafay Kubernetes Operations Platform (KOP) now features a fully integrated GPU Resource Dashboard that visualizes critical GPU metrics so developers and operations teams can seamlessly monitor, operate, and improve performance for GPU-based container workloads – all from one unified platform.
“Rafay makes spinning up GPU-enabled Kubernetes clusters incredibly simple. In just a few steps an enterprise’s deep learning and inference projects can be fully operational,” explained Mohan Atreya, SVP Product and Solutions at Rafay Systems. “Not only do we provide the fastest path to powering environments for AI and machine learning applications, but the combination of capabilities in Rafay KOP enables scalable edge/remote use cases with support for zero-trust access, policy management, GPU monitoring and more across an entire fleet of thousands of clusters.”
DataStax Astra DB Now Delivers Connected, Real-Time Data Pipelines Powered by Streaming
DataStax, the real-time data company, unveiled “change data capture” (CDC) for Astra DB, a new capability for its multi-cloud database built on Apache Cassandra™. The new CDC for Astra DB is powered by advanced streaming technology built on Apache Pulsar. It processes and delivers database changes, in real time, via event streams, making real-time data available for use across data lakes, data warehouses, search, artificial intelligence and machine learning. This powerful integration of event streaming with the high-scale, high-performance Astra DB multi-cloud database enables any organization to create smarter and more reactive applications fueled by connected, real-time data.
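The change-data-capture pattern described above can be pictured with a toy in-memory version: writes to a primary table are emitted as change events, and downstream consumers (a search index, a warehouse, a feature store) keep themselves in sync purely from that stream. This is a conceptual sketch of the pattern only, not the Astra DB or Apache Pulsar API.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

# Toy illustration of change data capture: every mutation to the primary
# table is also published as a ChangeEvent, which independent consumers
# apply to their own stores. Conceptual sketch, not DataStax's API.
@dataclass
class ChangeEvent:
    op: str                  # "insert", "update", or "delete"
    key: str
    value: Optional[dict]    # None for deletes

class CapturedTable:
    def __init__(self):
        self.rows = {}
        self.subscribers: List[Callable[[ChangeEvent], None]] = []

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def upsert(self, key, value):
        op = "update" if key in self.rows else "insert"
        self.rows[key] = value
        self._emit(ChangeEvent(op, key, value))

    def delete(self, key):
        self.rows.pop(key, None)
        self._emit(ChangeEvent("delete", key, None))

    def _emit(self, event):
        # In a real deployment this would publish to an event stream
        # (e.g. a Pulsar topic) rather than call handlers in-process.
        for handler in self.subscribers:
            handler(event)

# A downstream "search index" maintained entirely from the event stream.
search_index = {}

def index_handler(event: ChangeEvent):
    if event.op == "delete":
        search_index.pop(event.key, None)
    else:
        search_index[event.key] = event.value

table = CapturedTable()
table.subscribe(index_handler)
table.upsert("user:1", {"name": "Ada"})
table.upsert("user:1", {"name": "Ada Lovelace"})
table.delete("user:1")  # the index tracks the delete automatically
```

The point of the pattern is decoupling: the primary database never needs to know which downstream systems exist, and new consumers can subscribe without touching the write path.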
“CDC for Astra DB capability is unique in that it taps the performance of the world’s most scalable database and combines it with next-generation event streaming to fuel today’s data-intensive applications with a solution that just works,” said Ed Anuff, chief product officer at DataStax.