insideBIGDATA Latest News – 6/16/2021


In this regular column, we’ll bring you all the latest industry news centered on our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

Vertica Announces Early Access of Vertica Eon Accelerator 

Vertica announced early access of its new unified analytics-as-a-service offering, Vertica Eon Accelerator, delivering analytics at top speed from start to finish. Available via the Early Access Program, Vertica Eon Accelerator delivers high-performance, scalable analytics as well as end-to-end, in-database machine learning to organizations that require the right level of resourcing, management, and control for each analytical use case – all built on the proven cloud-native architecture of Vertica in Eon Mode, separating compute from storage and leveraging S3 object storage on AWS.

“There’s a clear market need for advanced analytics and machine learning delivered as a service. However, current cloud data warehouses force organizations to sacrifice on performance and scale and offer an inflexible ‘black box’ approach,” says Colin Mahony, GM and SVP of Vertica, Micro Focus. “Vertica Eon Accelerator offers organizations much greater transparency and control – control of their environment, control over tuning their queries, ownership of their data – all while relying on a field-proven, cloud-optimized architecture that delivers the highest levels of performance at extreme scale with forecastable and transparent pricing. Our acquisition of Full 360 will further enhance our managed service in meeting this huge market opportunity.”

DNA Data Storage Alliance Publishes First White Paper, Launches Website

The DNA Data Storage Alliance, an organization of more than 25 leading companies formed by Twist Bioscience Corporation (NASDAQ: TWST), Illumina, Inc. (NASDAQ: ILMN) and Western Digital (NASDAQ: WDC) together with Microsoft Research, announced its first white paper, titled “Preserving our Digital Legacy: An Introduction to DNA Data Storage.” The white paper, which can be found on the newly launched DNA Data Storage Alliance website, www.dnastoragealliance.org, presents DNA data storage fundamentals in an accessible way, both for technically curious readers and for those with IT, computer science, or electrical engineering backgrounds interested in the benefits, a technical overview, and the cost of ownership of this potential new storage medium. It discusses why DNA data storage is needed and how it is expected to address the exponential growth of digital data.

“It’s undeniable that data growth is outpacing the scalability of today’s storage solutions. Literally, everything we do revolves around data – and capturing, storing, processing and mining it only serves to create even more data. The density and stability of DNA storage will help the industry cost-effectively cope with the expected future growth of archival data for many decades to come,” said Steffen Hellmold, vice president, corporate strategic initiatives, Western Digital.

MLCommons™ Launches MLPerf™ Tiny Inference Benchmark

MLCommons, an open engineering consortium, launched a new benchmark, MLPerf™ Tiny Inference, to measure how quickly a trained neural network can process new data on extremely low-power devices in the smallest form factors, with an optional power measurement. MLPerf Tiny v0.5 is the organization’s first inference benchmark suite that targets machine learning use cases on embedded devices.

Embedded machine learning is a burgeoning field where AI-driven sensor data analytics is performed in real-time, close to where the data resides. The new MLPerf Tiny Inference benchmark suite captures a variety of use cases that involve “tiny” neural networks, typically 100 kB and below, that process sensor data such as audio and vision to provide endpoint intelligence. 
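To put the “100 kB and below” figure in perspective, here is a back-of-the-envelope Python sketch. The layer shapes are hypothetical, not taken from any MLPerf Tiny submission; the point is simply that a small fully connected network fits the tiny budget once quantized to int8 (one byte per parameter):

```python
# Estimate the int8 size of a hypothetical "tiny" keyword-spotting-style
# network to see how such models fit a ~100 kB budget.

def dense_params(n_in: int, n_out: int) -> int:
    """Weights plus biases for one fully connected layer."""
    return n_in * n_out + n_out

# Hypothetical network: 49x10 spectrogram input -> two hidden layers -> 12 classes
layers = [(49 * 10, 128), (128, 128), (128, 12)]
total_params = sum(dense_params(n_in, n_out) for n_in, n_out in layers)

# With int8 quantization, each parameter occupies one byte.
size_kb = total_params / 1024
print(f"{total_params} parameters ≈ {size_kb:.1f} kB at int8")  # ≈ 79.0 kB
```

Even this toy model lands just under the 100 kB mark, which is why int8 quantization and small dense or convolutional backbones dominate in this space.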

The first v0.5 round included five submissions from academic institutions, industry organizations, and national labs, producing 17 peer-reviewed results. Submissions this round included software and hardware innovations from Latent AI, Syntiant, PengCheng Labs, Columbia, UCSD, CERN, and Fermilab. To view the results, please visit https://www.mlcommons.org/en/inference-tiny-v01.

“Tiny machine learning is a fast-growing field and will help to infuse ‘intelligence’ in the small everyday items that surround us,” said MLPerf Tiny Inference working group chair, Colby Banbury, of Harvard University. “By bringing MLPerf benchmarks to these devices, we can help to measure performance and drive efficiency improvements over time.”

Cloud-native Altair® SmartWorks™ Empowers Enterprises to Make Data-driven Decisions 

Altair (Nasdaq: ALTR), a global technology company providing solutions in simulation, high-performance computing (HPC), and artificial intelligence (AI), announced the release of Altair SmartWorks, its next-generation, cloud-native platform to empower augmented, data-driven decision making. SmartWorks harnesses the full power of AI, analytics, and the Internet of Things (IoT) to help organizations improve and ensure production quality, develop connected product lines, optimize maintenance schedules, and implement recurring revenue models. It also allows companies to customize marketing analytics, automate financial systems, and more.

“SmartWorks will disrupt the way businesses innovate and make decisions by making it easy to leverage AI and IoT in automation and analytics,” said James R. Scapa, founder and chief executive officer, Altair. “For organizations strategizing their digital transformation efforts, SmartWorks is a future-proof accelerator and launchpad. It will help companies accelerate, innovate, and deliver.”

Dataiku Announces Fully Managed, Online Analytics Offering

Dataiku, the advanced Enterprise AI platform, announced the launch of Dataiku Online, which makes their AI and machine learning platform available as an online service for smaller, more agile organizations. Dataiku Online is fully managed by Dataiku, which allows companies of any size to access the full power of its platform without added pressure on IT to install and manage resources.

“Accessibility has always been of the utmost importance at Dataiku. We developed Dataiku Online to address the needs of small and midsize businesses, in addition to startups,” said Dataiku CEO Florian Douetteau. “We want to help companies that are just beginning their data and analytics journey to access the full power of our platform, where they can start by enhancing their day-to-day operations with simple data tools and then take their data even further with machine learning. Companies don’t need big data to do big things with their data, and Dataiku Online will make it easier for a whole new class of companies — from lean startups to scaling SMBs — to start.”

Swan Lake Beta Release of Ballerina Programming Language Lowers Barriers to Delivering Cloud-Native Applications

Increasingly enterprises are turning to cloud-native applications that integrate APIs, events, data, microservices, serverless apps, and other digital assets throughout their organizations and across the ecosystems in which they participate. Ballerina is the open-source language for cloud-native programming and integration designed to support these organizations with a unique bidirectional mapping of sequence diagrams and code. The new Swan Lake Beta release, available today, radically simplifies how developers build and deploy cloud-native applications through an intuitive syntax for developing services and APIs, seamless JSON support, and built-in concurrency control.

“Modern applications are not islands—nor are the teams of developers responsible for building them,” said Dr. Sanjiva Weerawarana, founder and leader of the Ballerina project and founder and CEO of WSO2. “The Ballerina language enables developers to create cloud-native applications that are inherently integrations of services, data, transactions and processes. The new Swan Lake Beta version extends this functionality—further enabling enterprises to tear down the barriers between app development and integration and between highly skilled and ad hoc developers to speed the delivery of innovative, new digital products and services.”

Pythian Launches FinOps Managed Services

Pythian Services Inc. (“Pythian”), a leading data, analytics and cloud services company, announced that the company has added cloud financial management, also known as FinOps, to its portfolio of Managed Services offerings. By helping organizations better manage the tradeoffs between cost, performance and availability, Pythian’s FinOps services maximize the business value of an enterprise’s cloud investment while saving an average of 25 percent on monthly cloud costs. Pythian’s FinOps services provide the visibility, tools and guidance necessary to help enterprises control, manage and optimize cloud costs. These insights into cloud cost, usage and performance metrics are available to a broad set of stakeholders within the enterprise through sophisticated reporting and analytics capabilities. 

“Pythian’s FinOps services promote a sustainable culture of financial accountability in terms of cloud investment—a paradigm shift for many organizations,” said Lynda Partner, senior vice president of products and offerings at Pythian. “Pythian’s Dedicated Cloud Advisors are central to FinOps services, providing ongoing guidance to interpret and act upon information made available through advanced reporting and analytics, ultimately enabling our customers to strike a balance between cloud cost and performance.”

GRAX Announces History Stream™, Unleashing SaaS App Data for Easy Downstream Consumption

GRAX, Inc., a leading SaaS Data Value Platform, announced the release of History Stream™. This breakthrough DataOps solution slashes the complexity, time, and cost required to pipe SaaS app data to the analytics and operations tools business users rely on. With History Stream, customers effortlessly stream native SaaS app data stored in their own AWS or Azure cloud data lake into industry-standard tools, seamlessly integrating with the rest of their data ecosystem. 

“Businesses depend on historical data. The more detail they have about current activities and trends, the better they can react to market changes and predict and shape their own future. Much of this data resides in SaaS applications, but the complexities inherent in making it accessible are enormous. Now, instead of having to code, businesses can make every version of that data widely and securely available with just a few clicks,” said Joe Gaska, CEO of GRAX. “History Stream gives businesses a streamlined, hassle-free, and efficient way to unlock strategic value from SaaS data by pushing it straight from the cloud environment they already own. We un-silo the data and serve it up in a way that is actionable, in whatever tools businesses want to consume it.” 

GoodData Brings Enterprise-Grade Analytics to the 21st Century with Launch of GoodData.CN Production Editions

GoodData, a leader in Data as a Service analytics infrastructure, launched the Production Editions of its new platform, GoodData Cloud Native (GoodData.CN). Building on the release of its Community Edition for developers, GoodData.CN Production is the first solution that gives customers robust enterprise-grade analytics delivered as a microservices-based analytics stack. GoodData.CN Production includes three plan options: Free, for simple production deployments; Growth, for self-service analytics; and Enterprise, for large production deployments with enhanced features. 

“Legacy data architecture continues to hold countless enterprises back from achieving analytics at scale and delivering business value. With the Production Editions of GoodData.CN, we finally have the ability to architect a new, modern future for analytics,” said GoodData CEO and Founder Roman Stanek. “It’s the first solution of its kind to leverage the best practices for modern IT, while plugging into your existing infrastructure for quick deployment and scale.”

AtScale Deepens Snowflake Integration with Snowpark for Advanced Automation and Orchestration

AtScale, a leading provider of semantic layer solutions for modern business intelligence and data science teams, announced integration with Snowflake’s Snowpark Java UDFs. Snowpark enables AtScale to execute advanced analytic functions within the Snowflake Data Cloud, further optimizing complex analysis, query performance and resource consumption, resulting in unparalleled ROI. This initial integration provides a foundation for embedding additional, high-value services within the Snowflake Data Cloud. The AtScale semantic layer provides BI analysts and data scientists with a business-oriented, virtualized view of data within cloud data platforms that can be accessed through applications like PowerBI, Excel, Tableau, and DataRobot. As organizations accelerate migrations to the Snowflake Data Cloud, the ability for BI and data science teams to maintain consistent logical views and high-performance access is critical.

“AtScale shields data consumers from the complexity of raw data and gives them the ability to access data with the tools of their choice,” said Scott Howser, Chief Customer Officer at AtScale. “Snowpark lets us enhance our customers’ experience, delivering accelerated query performance and greater scalability for business intelligence and data science teams working on the Snowflake Data Cloud.”

Element Announces Element Unify Integration with AWS IoT SiteWise to Enable Condition-based Monitoring for Industrial Customers

Element, a leading software provider in IT/OT data management for industrial companies, announced a new offering featuring an API integration between its Element Unify product and AWS IoT SiteWise, a managed service from Amazon Web Services, Inc. (AWS) that makes it easy to collect, store, organize, and monitor data from industrial equipment at scale. The API integration is designed to give customers the ability to centralize plant data model integration and metadata management, enabling data to be ingested into AWS services, including AWS IoT SiteWise and an Amazon Simple Storage Service (Amazon S3) industrial data lake.

“Operations data must be easy to use and understand in both the plant and in the boardroom. Industrial organizations who transform their plant data into context data for use in AWS IoT SiteWise, and other AWS services, will drive greater operational insights and achieve breakthrough business outcomes,” said Andy Bane, CEO of Element.

Planful Debuts “Predict: Signals,” a Native AI and ML Anomaly Detection Technology for FP&A

Planful Inc., the pioneer of financial planning, analysis (FP&A), and consolidations cloud software, announced the launch of “Predict: Signals,” the first of a range of product releases in the Planful Predict portfolio, a suite of native artificial intelligence and machine learning (AI/ML) products that will be released in 2021 and beyond. Predict: Signals, a native Al/ML anomaly detection technology, eliminates the need for a detailed manual review of data, ensuring forecasts are accurate and alerting the business to outliers in data. The solution checks for abnormalities, identifies patterns, and augments planning and decision-making efforts with intelligent forecasts and recommendations, using a native AI/ML engine.
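As an illustration of the kind of check such a detector performs – this is a generic statistical sketch, not Planful’s actual algorithm – the following Python snippet flags outliers in a series of figures using a robust z-score based on the median absolute deviation (MAD), which tolerates outliers better than a mean-based z-score on small samples:

```python
# Generic anomaly-detection sketch (NOT Planful's method): flag values whose
# robust z-score, computed from the median and the median absolute deviation,
# exceeds a threshold.
from statistics import median

def find_anomalies(values, threshold=3.5):
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:  # all values (essentially) identical
        return []
    # 0.6745 scales MAD so the score is comparable to a standard z-score.
    return [(i, v) for i, v in enumerate(values)
            if 0.6745 * abs(v - med) / mad > threshold]

# Monthly expense figures with one obvious outlier at index 5.
expenses = [10100, 9900, 10050, 9950, 10000, 25000, 10020]
print(find_anomalies(expenses))  # -> [(5, 25000)]
```

A production system layers seasonality models and forecasting on top of checks like this, but the core idea – surface the handful of figures a human should look at – is the same.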

“The Planful Predict suite of applications will further modernize how finance and accounting professionals accomplish their work through three key attributes,” said Grant Halloran, Planful’s Chief Executive Officer. “First, as a super-powered digital assistant that operates like a million sets of extra eyes, 24/7, searching for anomalies in financial data. Second, as a technology that supports, yet doesn’t replace the skills of intuitive human planners and analysts. Third, as a frictionless aid that is native in the platform at the point of need, thus eliminating the need for third-party AI. Predict: Signals has the ability to augment a user’s efficiency in data signal detection by orders of magnitude.”

DataRobot Delivers Scalable Inference with Snowflake Snowpark

DataRobot, a leader in Augmented Intelligence, announced a new integration with Snowflake, the Data Cloud company, to bring the power of Snowpark—a new developer experience created by Snowflake—to DataRobot users. This partnership comes on the heels of DataRobot’s recent acquisition of Zepl, unlocking new capabilities within DataRobot’s platform for the most advanced data scientists. As demonstrated during Snowflake’s 2021 Summit, Zepl’s capabilities—now part of DataRobot’s platform—will help joint users quickly develop, train, and deploy models by providing a preconfigured, fully featured environment for Snowpark-driven model development. The multi-layered integration builds on DataRobot and Snowflake’s existing strategic partnership and integrations to empower every organization to succeed with AI.

“The Snowflake and DataRobot partnership delivers unmatched data and AI synergies, and we’re proud to be extending the value we can drive for customers even further,” said Nenshad Bardoliwalla, SVP of Product, DataRobot. “By creating a best-in-class integration with Snowpark, we’re giving customers the ability to deploy their AI models directly into their Data Cloud, extending the power of our Augmented Intelligence to where their business-critical data lies. Together with Snowflake, we are closing the AI production gap for good.”

Synthesis AI Achieves Largest Synthetic Data Set in the Industry with 40,000 Unique Identities

Synthesis AI, a pioneer in synthetic data technologies, announced they have released 40,000 unique high-resolution 3D facial models.  Through the company’s synthetic data-as-a-service FaceAPI solution, users can now programmatically create perfectly labeled image training data spanning 40,000 unique identities. Demonstrating Synthesis AI’s commitment to addressing ethical AI issues related to bias and privacy, this data set not only represents the largest collection of 3D facial models available anywhere, but also is the most diverse, spanning gender, ethnicity, age, and BMI.

“On the heels of our recent funding announcement, we are excited to continue this type of growth and momentum as a company,” said Yashar Behzadi, CEO of Synthesis AI. “Our goal is to address bias and privacy in AI and to democratize access to high-quality data. Making 40,000 unique identities available further strengthens this mission, while also addressing the technical, economic, and ethical issues with current approaches.”

Starburst Announces Stargate, A Gateway For Global Cross-Cloud Analytics

Starburst, the analytics anywhere company, announced Starburst Stargate. Stargate, available as an add-on for Starburst Enterprise customers, enables organizations to run cross-cloud analytics on data distributed across the globe. Additionally, Stargate ensures compliance with global data sovereignty regulations, enabling organizations to analyze data residing across borders without moving it.

“The ever-changing global regulatory environment makes it challenging for both the C-suite and data engineers alike to manage data sets across multiple clouds and availability zones around the world,” said Justin Borgman, Co-Founder & CEO at Starburst. “Until now, data privacy and data sovereignty regulations have been a major obstacle to timely and accurate analytics. Companies are seeking better solutions that allow for efficient cross-cloud and cross-region analytics that don’t require data movement. Starburst Stargate is the final frontier in our vision of enabling analytics anywhere.”

StarTree Announces Commercial Availability of User-Facing Analytics Platform Raising the Bar for Speed, Scalability and Performance

StarTree, Inc. announced the commercial availability of its “blazing-fast” cloud analytics-as-a-service platform, making it easier for organizations to share self-service analytics with their most important external stakeholders: their customers, partners, and members. Built around the popular Apache Pinot™, the new StarTree Cloud uniquely delivers fresh analytics at scale with low latency for thousands or millions of users. Built for progressive data-driven companies to help increase engagement and revenue through real-time user-facing analytics, the platform enables companies to build rich, interactive data-driven analytical applications to empower their customers and users to make informed decisions with the most up-to-date information.

“Companies have a remarkable opportunity to provide insights to their users to help drive decision making and behavior,” said Kishore Gopalakrishna, co-founder and CEO of StarTree. “The next revolution in analytics is the recognition that every user is a decision maker in their own right, and that companies can add value by helping users make smart decisions.”

Domo Introduces New Domo Everywhere Offering

Domo (Nasdaq: DOMO) rolled out its new Domo Everywhere offering, which updates components of its existing embedded analytics solutions and adds new capabilities that help organizations improve customer experiences and create new revenue streams on Domo’s platform. The new Domo Everywhere goes beyond legacy embedded analytics to enable sharing and collaboration of data in a highly curated experience. Domo Everywhere is the first offering that enables organizations to deliver a complete modern BI experience to their customers and partners. As part of today’s news, Domo’s existing set of embedded analytics products, including Domo Publish and Domo Embed, will be consolidated under Domo Everywhere, making it easier for customers to buy, deploy and benefit from this comprehensive embedded analytics solution.

“Domo is pushing external BI beyond the traditional ‘places’ that analytics content is embedded and the value that is received,” said Jay Heglar, Chief Business Officer, Domo. “With Domo Everywhere, we’re giving customers an easy way to create, deliver and monetize high value data solutions in a fraction of the time it would normally take and with a fraction of the resources required of any other approach. Domo provides the canvas to create and the platform to distribute data solutions to all customers making monetization of data, which can be a trapped asset, a reality.” 

Mipsology’s New Zebra FPGA IP Accelerates Neural Network Computing for Edge and Embedded AI Applications

AI software innovator Mipsology announced the availability of Zebra FPGA IP, a solution that accelerates the development and deployment of FPGA- and adaptive SoC-based machine learning systems. Zebra FPGA IP is optimized to power edge applications spanning industrial automation, security, autonomous vehicles, smart cities, super resolution and more. Zebra IP is currently being used by early access partners and is expected to ship in early Q3.

“Zebra IP has the same features and behavior as our Zebra inference accelerator running on PCIe cards, but in a much more flexible format, so that developers can enhance their systems with additional capabilities,” said Ludovic Larzul, CEO and founder of Mipsology. “They can integrate complex functions around Zebra to create smart systems deployed everywhere. The possibilities are endless.”

Independent Benchmark Demonstrates First Trillion-edge Knowledge Graph – Stardog Recognized for Providing Sub-Second Query Times on Hybrid, Multicloud Data

A new benchmark study from McKnight Consulting Group (MCG) unveiled the first demonstration of a massive knowledge graph that consists of materialized and virtual graphs spanning hybrid multicloud platforms. MCG affirmed that Stardog, a leading Enterprise Knowledge Graph platform provider, enabled users to create a one trillion-edge knowledge graph with sub-second query times without the need to store all the data in a single, centralized location. This is critical because today’s changing and proliferating IT environment requires data connections that don’t rely on centralization and/or the need for data movement.

“Until recently, the predecessors to the modern data fabric have been data federation and virtualization technologies; however, a majority of these platforms have failed to deliver true inter-connectedness at scale with performance, because they are hampered by bottlenecks inherent in all databases and data stores in the query chain,” said Evren Sirin, CTO and co-founder of Stardog. “Instead of tackling the problem with another abstraction layer, graph model connects data so organizations can understand the data relationships, determine the prioritization of nodes in the relationship, and leverage visualization so it is easier for users to search, investigate, and analyze data, and expose patterns and trends by connecting diverse forms of connected knowledge.”

Graphite: Here’s a platform that revolutionizes the business intelligence space

More and more companies are turning to predictive analytics not only to meet customer expectations but to capitalize on every insight they provide. Big players can afford an army of data scientists applying statistical techniques such as data mining, machine learning, and predictive modeling to gain a competitive advantage, but SMEs and startups struggle to gain the benefits of advanced analytics. AI and machine learning techniques are expensive and time-consuming, and their output of charts and numbers can be difficult for non-specialists to interpret. Fortunately, there’s a tool that automates predictive analytics without requiring a single line of code: Graphite, an automated predictive analytics platform with a data storytelling capability.

“You don’t need a data scientist team to make sense of your data anymore. Graphite will run predictive analytics algorithms on your data and help you to combine visuals and narrative to share insights with your team so they can understand it,” said Hrvoje Smolic, founder at Graphite Note.

Postgres Professional Announces Availability of Postgres Pro Enterprise 13

Postgres Professional, the company that makes PostgreSQL enterprise-ready, announced the availability of Postgres Pro Enterprise 13. Based on PostgreSQL 13.3, Postgres Pro Enterprise 13 includes all the new features introduced in PostgreSQL 13, as well as key Postgres Professional innovations, such as hot minor upgrades, storage-level compression and 64-bit transaction identifiers. The company has also announced it has open sourced its innovative multi-master cluster, making these capabilities available to the community.

“We are very excited to bring Postgres Pro Enterprise 13 to market because it enables enterprises to update their production systems with the latest PostgreSQL features and fixes while still taking advantage of all the advanced features that Postgres Pro Enterprise provides,” said Oleg Bartunov, Postgres Professional Founder and CEO. “Postgres Pro Enterprise 13 will enable more companies in more industries to take advantage of all the great benefits of PostgreSQL. Further, by open sourcing the multi-master functionality, we are opening this key capability to a wider audience of developers and users.”

Quest Data Modeling and Data Intelligence Enhancements to Strengthen Data Operations and Governance for Data Empowerment

Quest Software, a global systems management, data protection and security software provider, announced important enhancements to erwin Data Modeler and erwin Data Intelligence only five months after its acquisition of erwin, Inc. When integrated, these solutions provide a closed loop for metadata management and automate processes and workflows to improve time to value for key digital transformation initiatives such as cloud migration, to optimize regulatory and risk compliance efforts, and to increase data literacy. erwin Data Modeler and erwin Data Intelligence are part of the Information and Systems Management (ISM) business and its Quest Data Empowerment Platform, a set of best-of-breed solutions for data operations, data protection and data governance that guide operational and strategic decision-making.

“These new erwin by Quest product releases demonstrate our commitment to advancing data modeling and data governance technologies because customers increasingly rely on the data intelligence they provide to compete in today’s data-driven world,” explained Heath Thompson, president and general manager for Quest ISM. “As the undisputed, worldwide leader in data modeling and a key provider of data protection and data governance solutions, we’re dedicated to our customers’ success not only in risk management but also in using their enterprise data assets to create greater value across all functions. The Quest Data Empowerment Platform ensures this by giving IT and business teams the data operations, protection and governance capabilities they require, in addition to making a real-time, relevant and high-quality data pipeline accessible to a broader range of stakeholders for enterprise collaboration and decision-making.”      

Qlik Launches Data Literacy 2.0 to Drive Data Fluency in the Enterprise  

Qlik® announced the next generation of its award-winning Data Literacy program to help drive the data fluency needed for a world witnessing continued digital acceleration. Data Literacy 2.0 is a comprehensive offering that enables companies to kickstart a data literacy program and scale it to thousands of employees. The Data Literacy 2.0 program enables organizations to start with a small initial investment, then affordably scale data literacy across the entire workforce. It is designed to take individuals from data novice to data fluent, able to confidently interpret and work with data and to help drive best practices in data-driven decision-making across their organization.

“Being data literate can’t be a specialist skillset or limited to those with technical ability. Everyone needs to be fluent in data in both their personal and professional lives,” said Dr. Paul Barth, Global Head of Data Literacy at Qlik. “As we emerge from global lockdowns and accelerate recovery plans, business leaders need to arm their employees with the skills to succeed in an increasingly digital and data-driven workplace. If they don’t, they not only risk losing talent to organizations making greater investments in employee upskilling, but they stand to undermine the future productivity, performance and competitiveness of their business.”  

TYAN Delivers AI and Cloud Optimized Systems based on the 3rd Gen Intel® Xeon® Scalable Processors

TYAN®, an industry-leading server platform design manufacturer and MiTAC Computing Technology Corporation subsidiary, brings 3rd Gen Intel® Xeon® Scalable processor-based server platforms featuring built-in AI acceleration, enhanced security, and a 2X increase in PCIe 4.0 I/O for the most demanding requirements across HPC, cloud, storage and 5G workloads.

“With AI and cloud growth, our new servers based on 3rd Gen Intel Xeon Scalable processors are designed to deliver a diversity of workloads for high performance systems, like AI inference or deep learning platforms, to a cost-optimized high IOPS cloud storage platform,” said Danny Hsu, Vice President of MiTAC Computing Technology Corporation’s Server Infrastructure Business Unit.

SYSTRAN Launches “SYSTRAN Translate PRO”

As the light at the end of the pandemic tunnel gets brighter, it’s time that small and medium-sized businesses – the backbone of the global economy – begin taking necessary steps to bounce back. Today, SYSTRAN, the leader in AI-based translation technology, announces the launch of its newest cloud-based software, “SYSTRAN Translate PRO”, designed to equip SMBs, freelancers and organizations with the ability to communicate across language borders and conquer new markets in just a few clicks.

“Communication is a critical component for companies that want to be agile,” states Director of Cloud Sales at SYSTRAN, Keith Jameson. “Our SaaS solution can help provide smaller businesses with the same technology being used by large enterprise corporations that benefit from a global presence. We listened to customer requests and it became apparent that we needed to develop a solution for SMBs so they too could provide sales support and customer service across language barriers.”

Amplitude Unveils New Experimentation Application for Digital Optimization

Amplitude, the Digital Optimization System, introduced Amplitude Experiment, the industry’s first experimentation solution powered by customer behavior and product analytics. Amplitude Experiment provides organizations an end-to-end experimentation and delivery workflow that integrates customer data into every step from generating a hypothesis to targeting users to measuring results. Organizations are empowered to run higher impact A/B tests and remotely configure experiences for key segments without incremental engineering work. 

“Every company needs to be digital first, but too many are guessing what they should build next and wasting time on experiments that are doomed to fail,” said Justin Bauer, SVP Product, Amplitude. “Since Experiment is powered by customer behavior, businesses can free themselves from low-impact activities and get the invaluable insight that can accelerate and truly scale how they design and deliver digital products and experiences. This is an entirely new chapter for A/B testing that enables any company – whether they’re digital native or embarking on their digital transformation journey – to make bigger and smarter bets that drive stronger business results.”

DataSecOps Innovator Satori Launches Self-Service Data Access to Streamline Enterprise Data Access Management

Satori, a DataSecOps company revolutionizing data access, security and privacy for the modern data infrastructure, announced Self-Service Data Access. The new capability transforms enterprise data access requests from a three-week, IT-driven process into a five-minute, self-directed operation, while preserving permission, authentication and security policies.

“Granting and revoking access to data is a big headache for data engineers in most organizations, and an unnecessary delay in data access for analysts,” said Yoav Cohen, CTO and co-founder of Satori. “Our new Self-Service Data Access automates the entire data access process from the access request, triggered by an analyst, through review and approval by the data steward to automatic access revocation when the data is no longer used or needed. The efficiency gains are enormous, cutting what typically takes weeks into a few minutes, while maintaining security and privacy policies.”

Dremio’s Dart Initiative Accelerates the Obsolescence of Cloud Data Warehouses

Dremio, an innovation leader in data lake transformation, took a major step forward in obsoleting the cloud data warehouse. The release marks the first delivery in the company’s Dart Initiative, which enables customers to run all mission-critical SQL workloads directly on the data lake. Dremio is a service that sits between data lake storage and end users who want to query that data directly for high-performing dashboards and interactive analytics, without copying data into data warehouses or creating aggregation tables, extracts, cubes and other derivatives. Dremio drastically simplifies the data architecture, accelerates query performance, and enables data democratization without the vendor lock-in of cloud data warehouses.

“Enabling truly interactive query performance on cloud data lakes has been our mission from day one, but we’re always looking to push the boundaries and help our customers move faster. We launched the Dart Initiative to deliver just that,” said Tomer Shiran, founder and chief product officer at Dremio. “Not only are we dramatically increasing speed and creating efficiencies, we’re also reducing costs for companies by eliminating the data warehouse tax without trade-offs between cost and performance.”

TruEra Launches TruEra Monitoring: the First Highly Accurate and Actionable Solution for Monitoring and Debugging AI and Machine Learning Models in Production

TruEra, which provides the first suite of AI Quality solutions for AI explainability and model quality, launched a new offering: TruEra Monitoring, the first highly accurate and actionable solution for monitoring and debugging Artificial Intelligence (AI) and Machine Learning (ML) models in production.

“We heard clearly from our customers that what they really needed was a solution that goes beyond basic machine learning performance,” said Shayak Sen, CTO, TruEra. “TruEra Monitoring is based on our powerful AI Quality Platform, featuring world class AI explainability and model quality analytics. This means that TruEra Monitoring not only quickly identifies and alerts you to emerging issues but also accurately pinpoints the cause of those issues, so that data scientists can quickly debug models and get them back to high performance. It’s a game changer for data science teams, who are tired of wild goose chases.”

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1 – https://twitter.com/InsideBigData1
