Public Sector Services Powered by Data and AI – OpenGov Asia

To better serve and protect communities, maintain data security at scale, and perform essential tasks, all government agencies must establish a strong, contemporary data infrastructure that supports data innovation.

Government and the public sector stand to gain considerably by adopting AI into every element of their work. Government AI must account for privacy and security, compatibility with legacy systems, and changing workloads.

Artificial intelligence is already being used to help run the government, with cognitive applications doing everything from reducing backlogs and cutting costs to handling tasks that humans cannot easily do, such as predicting fraudulent transactions and identifying criminal suspects using facial recognition.

While AI-based technology may fundamentally transform how public-sector employees do their jobs in the coming years — such as eliminating some jobs, redesigning countless others, and even creating entirely new professions — it is already changing the nature of many jobs and revolutionising aspects of government operations.

AI in government services is centred on machine learning and deep learning, computer vision, speech recognition, and robotics. When used correctly, these techniques yield real, measurable results.

Cyber anomaly detection, on the other hand, has the potential to transform cybersecurity strategies in government systems. The possibilities are endless, but they are only now taking shape.

The OpenGov Breakfast Insight on 4 August 2022 explored cutting-edge methods for enabling large-scale analytics in the public sector.

Public Sector Services Powered by Data and AI

Mohit Sagar: Technologies to cope with new demands
Mohit Sagar: Innovate faster by leveraging the cloud’s capacity and democratising safe data access

Kicking off the session, Mohit Sagar, CEO & Editor-in-Chief, OpenGov Asia, acknowledges that data and artificial intelligence will drive the future of government services. “With a unified data platform, the public sector will be able to better serve citizens and protect their communities.”

Governments, in general, are one of the world’s largest employers, with numerous ministries, agencies and departments. The vast network of offices and services introduces significant complexity, operational inefficiencies and, frequently, a lack of transparency.

Agencies must deal with massive amounts of data in various structured and unstructured formats, and the volume will only increase over time. Moreover, legacy systems and traditional data warehouses prevent them from recognising or taking advantage of the full potential of data and analytics. Data is, more often than not, siloed by agency and department, undermining their efforts to undergo digital transformation.

To generate real-time actionable insights and make data-driven decisions, data must be securely shared and exchanged at scale. Giving government organisations and policymakers access to deeper, more relevant insights for decision-making is only possible through data modernisation.

Much of the information that government agencies oversee is extremely sensitive, including information about the nation’s infrastructure, energy and education as well as personal health and financial matters. Data protection at every level of the platform must be ensured through tight integration with the cloud provider’s granular access controls.
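Granular access control of the kind described can be sketched as a simple attribute check before each read. Real platforms delegate this to the cloud provider’s identity and access management; the roles, classifications and records below are invented for illustration.

```python
# Hypothetical row-level access check: a reader only sees records whose
# classification is at or below their clearance level.
CLEARANCE = {"public": 0, "internal": 1, "restricted": 2}

def visible_rows(rows, reader_clearance):
    """Return only the rows this reader is cleared to see."""
    limit = CLEARANCE[reader_clearance]
    return [r for r in rows if CLEARANCE[r["classification"]] <= limit]

records = [
    {"id": 1, "classification": "public"},
    {"id": 2, "classification": "restricted"},
    {"id": 3, "classification": "internal"},
]
print([r["id"] for r in visible_rows(records, "internal")])
# → [1, 3]
```

In a production platform this filter would be enforced by the storage layer itself rather than by application code, so that every downstream tool inherits the same policy.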

The fact is that citizens stand to gain from the more individualised and effective services, enhanced national security, and wiser resource management that a robust data strategy can deliver.

By integrating data with analytics and AI, government agencies can readily access all their data for downstream advanced analytics that support complex security use cases.

With such a platform, government security operations teams can quickly identify sophisticated threats, minimising the need for human effort through analytical automation and collaboration and speeding up investigations from days to minutes.
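The anomaly-detection idea behind this kind of automated threat spotting can be illustrated with the simplest possible statistical baseline. This is a minimal sketch, not the detection logic of any actual platform; the event counts and the z-score threshold are hypothetical.

```python
import statistics

def detect_anomalies(counts, threshold=3.0):
    """Flag event counts whose z-score exceeds the threshold.

    A real security platform would use far richer features and models;
    this sketch only shows the core idea of flagging statistical outliers.
    """
    mean = statistics.fmean(counts)
    stdev = statistics.stdev(counts)
    return [
        (i, c) for i, c in enumerate(counts)
        if stdev > 0 and abs(c - mean) / stdev > threshold
    ]

# Hypothetical hourly login-failure counts; hour 6 spikes sharply.
hourly_failures = [12, 9, 11, 10, 13, 8, 250, 12, 10, 11, 9, 12]
print(detect_anomalies(hourly_failures))
# → [(6, 250)]
```

The automation win is that checks like this run continuously over every stream, surfacing only the outliers for a human analyst to investigate.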

Data stored by public sector bodies can be extremely valuable when shared with other departments and used to elevate data-driven decision-making. The time has come to leverage the cloud’s scale and democratise secure data access to enable downstream BI and AI use cases, allowing government agencies to accelerate innovation.

Governments can improve citizen services while implementing smarter and more transparent governance by leveraging data, analytics and AI for actionable insights at scale. This approach eliminates data silos and improves communication and collaboration across agencies, delivering personalised citizen services alongside data security and cyber resilience.

Building a Scalable Data, Analytics and AI Strategy with Lakehouse Platform

Chris D’Agostino: Data leaders are at the forefront of shaping a sustainable future

Data infrastructure is an essential aspect of data processing and analysis, according to Chris D’Agostino, Global Field CTO, Databricks.

The complete backend computing support system needed to process, store, transfer and preserve data is referred to as the “data infrastructure.” Without the appropriate data infrastructures, businesses and organisations cannot extract value from their data.

“If there’s one thing that many of us all have in common, it’s that we believe in the impact that data and AI can and will have on the world,” says Chris. “Today, data and AI are transforming every major industry.”

On the other hand, with the ongoing spread of artificial intelligence and machine learning, there is an increasing need to rethink an organisation’s whole leadership and thought process, from product strategy and customer experience to strategies for increasing the efficiency of human resources.

Cloud data architectures contain the rules, models and policies that specify how data is gathered, stored, used and managed in the cloud within a company or organisation. They govern how that data flows, is processed, and is distributed across stakeholders and other applications for reporting, analytics and other purposes.

Every year, data collection by businesses and organisations increases thanks to IoT and new digital streams. In this climate, cloud data architecture-based data platforms are displacing more conventional data platforms, which are unable to handle the growing data quantities and increasingly demanding end-user applications like machine learning and AI.

Companies are using all available data to expedite, automate and improve decision-making to increase resilience and obtain a competitive edge in the market. These methods for digital transformation are supported by AI and data literacy.

To fully realise the benefit of data and AI, change management is necessary, just like with any change in working practices. It is essential to create a cohesive and evolving plan. This can be based on three pillars: business strategy, operationalisation and architecture (after the technology barriers have been recognised).

Whether it’s a business strategy, data management, or organisational knowledge, it’s critical to assess the organisation’s level of maturity and data literacy.

Databricks Lakehouse Platform combines the best elements of data lakes and data warehouses to deliver the dependability, strong governance, and performance of data warehouses while also allowing for the openness, flexibility and machine learning support of data lakes.

By removing the data silos that normally segregate and complicate data engineering, analytics, BI, data science and machine learning, this unified approach streamlines the current data stack. To increase flexibility, it is created using open standards and open-source software.
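One way to picture the lakehouse idea described above, open storage formats combined with warehouse-style schema enforcement and governance, is a toy sketch in plain Python. This is a conceptual illustration only; it is not the Databricks Lakehouse API, and the `GovernedTable` class, columns and schema are invented for the example.

```python
import json

class GovernedTable:
    """Toy illustration of a lakehouse-style table: rows live in an open
    text format (JSON lines), but every write is checked against a schema,
    mimicking the governance a warehouse adds on top of a data lake.
    """

    def __init__(self, schema):
        self.schema = schema          # column name -> expected type
        self.rows = []                # stand-in for files in object storage

    def append(self, record):
        # Reject writes that do not match the declared schema.
        if set(record) != set(self.schema):
            raise ValueError(f"columns {set(record)} != {set(self.schema)}")
        for col, expected in self.schema.items():
            if not isinstance(record[col], expected):
                raise TypeError(f"{col} must be {expected.__name__}")
        self.rows.append(json.dumps(record))  # stored as open-format text

    def scan(self):
        return [json.loads(r) for r in self.rows]

table = GovernedTable({"agency": str, "requests": int})
table.append({"agency": "transport", "requests": 1042})
try:
    table.append({"agency": "health", "requests": "n/a"})  # bad type
except TypeError as err:
    print("rejected:", err)
print(table.scan())
```

The point of the sketch is the combination: the stored rows stay in an open, tool-agnostic format, while the table object enforces the reliability guarantees the text attributes to warehouses.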

Additionally, its shared approach to data management, security and governance helps teams work more productively and develop more quickly.

In a global research effort in collaboration with an institution, Databricks polled 117 data leaders and the survey’s findings were illuminating and instructive.

An analytics leader’s biggest regret and issue was not embracing an open standards-based data architecture. “This didn’t surprise us. We are seeing many of our clients adopting the best open-source technologies,” Chris reveals.

In addition, the poll showed that only a small group of organisations are succeeding with their AI projects, while multi-cloud is a growing reality.

Most executives say they are currently evaluating or implementing a new data platform to address their current data challenges. During these challenging times, cloud technologies allow businesses to respond and scale rapidly.

With scalable data, analytics and AI strategy, organisations can create significant value. They can implement real-time monitoring, create tailored customer experiences, deploy predictive analytics, and much more. Databricks offers tools that are specifically designed to address the challenges described.

In Conversation With: The Future of Government Services and Shared Data

All government agencies’ data must be protected and every component safeguarded. Unifying data with analytics and AI makes it simpler to provide quick access for an organisation’s teams and complete support for security use cases.

Joseph Tan: At best, outdated legacy systems are a pain; at worst, they can seriously jeopardise security strategies

Joseph Tan, Deputy Director (Capability Development), Data Science & Artificial Intelligence Division, Government Technology Agency, emphasised the importance of data modernisation with a holistic approach. A policy-driven sector that can be entrusted with organisations’ data will lead to better customer service.

Joseph is convinced that “As technology advances, most businesses are confronted with issues caused by an existing legacy system. Instead of providing companies with cutting-edge capabilities and services such as cloud computing and improved data integration, a legacy system keeps a business constrained.”

A legacy system is computer software or hardware that is outdated but still in use. The system still meets the needs for which it was originally designed, but it does not allow for expansion. Because a legacy system can only do what it does now for the company, it will never be able to interact with newer systems.

“A business might keep using an old system for more than one reason. In the world of investments, for example, upgrading to a new system requires an initial investment of money and people, while keeping an old system running costs money over time,” Joseph explains.

On the other hand, when a whole company moves to a new system, there can be some internal resistance and worries about how hard it will be and what might go wrong. For example, legacy software might have been made with an old programming language, which makes it hard to find staff with the right skills to do the migration.

Additionally, there might not be much documentation of the system, and the people who made it might have left the company. It can be hard even to plan how to move data from an old system to a new one and to work out what the new system’s requirements will be.

Increased security risk, instability and inefficiency, incompatibility with new technology, poor company perception, costlier new-hire training, single points of failure and a lack of documentation are a few of the issues that older systems run up against.

At best, outdated legacy systems are a pain, and at worst, they can seriously jeopardise an organisation’s overall IT security strategy. Furthermore, the longer a business waits to update a legacy system, the more challenging the transition will be.

System modernisation is almost always a must before digital transformation can occur. Most businesses won’t be able to fully profit from contemporary technologies and solutions without it. “With this, finding the right talent would be very beneficial for the organisation to manage their modern technologies,” says Joseph.

Updating legacy systems brings several advantages: enterprises can enhance their IT security and sustain it by taking advantage of future vendor upgrades and fixes. Modern systems and solutions, including retrofitted legacy systems, are built to deliver optimal performance without consuming excessive amounts of computational power.

Even a legacy system may be modernised to include new features, giving business users additional capability and a better user experience. The truth is that updated legacy systems require less input from IT staff, freeing them up to focus on activities that really benefit a company.

Similarly, governments all over the world will undergo a fundamental upheaval because of big data and artificial intelligence. Even though the public sector has long used data, the potential and actual use of big data applications have an impact on some theoretical and practical aspects of decision-making. This is fuelled by both the data revolution and the concurrent advancement of advanced analytics.

The availability of data that may be employed in the machine learning process is a major aspect of the maturing of AI technology and the practicality of AI applications to public policy and administration.

However, without the underlying analytical technologies, the data revolution can be seen as only a change in the size of the data that is currently available rather than a fundamental change. As predictive analytics, innovative data and artificial intelligence gain prominence, it is critical to understand their roles in the public sector.

At the start of their data journey, organisations require data capture systems to discover information embedded in all levels of business operations. Following that, the data must be validated for informational accuracy and integrated to reduce the risk of drawing incorrect conclusions and to create a unified view of the business.

The final step is analysis, in which businesses collaborate with data analysts who use cutting-edge analytics tools to peel back layers of proprietary data in search of insights to power change.

Larger companies with more complex data integration and analytics processes can add predictive analytics as the fourth step.

When analysing enormous datasets (often referred to as “big data”), predictive data analytics, also referred to as advanced analytics, uses autonomous or semi-autonomous algorithms to make predictions based on information patterns. Data analysts may provide clients with greater service, which can result in more meaningful transformations, by delivering deeper insights into company data more quickly.
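The predictive “fourth step” above can be illustrated with the simplest possible model, a least-squares trend line fitted to historical counts and extrapolated one period ahead. The figures are hypothetical and the model is deliberately minimal; real predictive analytics would use richer features and validated models.

```python
def fit_trend(values):
    """Fit y = slope * x + intercept by ordinary least squares."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical monthly service-request volumes for six months.
history = [100, 112, 119, 133, 141, 152]
slope, intercept = fit_trend(history)
forecast = slope * len(history) + intercept  # predict month 7
print(round(forecast))
# → 162
```

Even this toy forecast shows the pattern the text describes: historical data is turned into a model, and the model is used to anticipate demand rather than merely report it.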

Think about how AI and machine learning might be used in the context of the data processing flow. Analytics tools assist data analysts in identifying areas for improvement in the business after private data has been collected, analysed and combined into a single view.

AI excels at discovering data patterns that humans cannot perceive, and this scales readily with the size of the dataset. To make data analytics frictionless, machine learning algorithms can also adapt to data pipeline input and human behaviour patterns. This can be accomplished by utilising natural language processing to recode communications between individuals within an organisation so that algorithms can comprehend and act on them.
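The recoding of free-text communications into machine-actionable form can be hinted at with a toy keyword router that maps messages to structured action labels. Real systems use trained language models rather than keyword rules; the categories, patterns and messages here are invented for illustration.

```python
import re

# Hypothetical mapping from keyword patterns to structured actions.
ROUTES = {
    r"\b(password|locked out|login)\b": "reset_credentials",
    r"\b(invoice|payment|refund)\b": "billing_review",
    r"\b(fraud|suspicious|unauthori[sz]ed)\b": "fraud_escalation",
}

def route_message(text):
    """Recode a free-text message into a structured action label."""
    lowered = text.lower()
    for pattern, action in ROUTES.items():
        if re.search(pattern, lowered):
            return action
    return "manual_triage"

print(route_message("I noticed a suspicious charge on my account"))
# → fraud_escalation
```

Once messages carry structured labels like these, downstream algorithms can act on them automatically, which is the frictionless pipeline the paragraph describes.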

Artificial intelligence and machine learning have become the “next big thing” in the government sector.

Smart solutions enable advances that are self-sustaining and AI and ML are at the heart of these. Executives and practitioners agree that AI and ML are catalysts and drivers across both the public and private sectors. As an AI system has a deeper understanding of data platforms and processes, it can continue to enhance its efficacy and capacity to provide personalised insights from massive data silos.

Conclusion

In closing, Chris shared that Databricks was established in 2013 to assist data teams in resolving the most challenging issues facing the globe, and the company has been investing in the Asia Pacific region to further this objective. “While there are countless possibilities, there are several challenges as well.”

It is insufficient to merely fund and adopt AI technologies. Businesses and organisations need a talent pool of experts who can use these AI tools in a way that guarantees the best outcomes.

Currently, customers from a wide spectrum of industries are collaborating with Databricks to tailor their clients’ experiences, improving their capacity to react to market dynamics and safeguarding both their own and all stakeholders’ interests. This is most evident in financial services, where real-time analytics help organisations deal with fraud.

“My particular favourite is Databricks’ assistance in Mitsubishi Tanabe’s efforts to quicken drug clinical trials in Japan. The possibilities for our collaboration are virtually endless,” Chris reflects.

Mohit recognises that digital transformation is vital in today’s VUCA environment. What is essential is that industry and government collaborate and work together. For long-term success and sustainability, there have to be partnerships between the public and private sectors.

Strategic alliances give businesses and government agencies a competitive edge. Partnerships are mutually beneficial, helping each party grow and improve. When people genuinely try to help each other, “it can help to get over certain weaknesses and be first movers in their field.”
