Can Disaster Recovery and Backup Be Deployed as 1-Click Modular Systems?

Overview

Planned or unplanned, infrastructure and application downtime can come at any time, from any direction, and in any form. The ability to keep an organization operational during a technology outage, facility destruction, loss of personnel, or loss of critical third-party services is critical to preventing irreversible damage to a business.

With the increasing global shift to e-commerce models and their reliance on 24/7 application uptime, high availability (HA) and disaster recovery (DR) impact the financial health of organizations.

High Availability Versus Disaster Recovery

While both increase overall availability, the notable difference is that with HA there is generally no loss of service: HA retains the service, while DR retains the data. With DR, there is usually a slight loss of service while the DR plan executes and the system is restored.

  • High availability (HA) – The measure of a system’s ability to remain accessible in the event of a system component failure. Generally, HA is implemented by building multiple levels of fault tolerance and/or load-balancing capability into a system.

  • Disaster recovery (DR) – The process by which a system is restored to a previous acceptable state, after a natural (flooding, tornadoes, earthquakes, fires, etc.) or man-made (power failures, server failures, misconfigurations, etc.) disaster.

High Availability Options

You can achieve high availability through clustering and/or load balancing of the nodes. Depending on the defined SLA, four HA options are possible with Mule (a brief failover sketch follows this list):

  • Cold Standby
  • Warm Standby
  • Hot Standby – Active-Passive
  • Active-Active
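
As a rough illustration of how these options differ in practice, the sketch below contrasts an active-passive (hot standby) failover check with simple active-active routing. It is a minimal sketch in TypeScript, assuming hypothetical ClusterNode objects with a checkHealth function; it is not Mule-specific and not tied to any particular clustering product.

```typescript
// Minimal sketch of two HA strategies. ClusterNode and checkHealth are
// hypothetical placeholders, not the API of any specific product.

interface ClusterNode {
  name: string;
  // Returns true if the node is healthy and can serve traffic.
  checkHealth(): Promise<boolean>;
}

// Hot standby (active-passive): all traffic goes to the primary node;
// the standby takes over only when the primary fails its health check.
async function activePassive(primary: ClusterNode, standby: ClusterNode): Promise<ClusterNode> {
  return (await primary.checkHealth()) ? primary : standby;
}

// Active-active: every healthy node serves traffic; requests are spread
// across the cluster with a simple round-robin counter.
function activeActive(nodes: ClusterNode[]): () => Promise<ClusterNode> {
  let next = 0;
  return async (): Promise<ClusterNode> => {
    for (let i = 0; i < nodes.length; i++) {
      const candidate = nodes[(next + i) % nodes.length];
      if (await candidate.checkHealth()) {
        next = (next + i + 1) % nodes.length;
        return candidate;
      }
    }
    throw new Error("No healthy node available");
  };
}
```

Cold and warm standby follow the same active-passive idea but keep the standby environment less fully provisioned and synchronized, which lengthens the failover time.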

Conclusion

Performance information for all managed devices can be collected and displayed. Both real-time and historical performance and load information can be accessed and evaluated using easy-to-understand graphics.


By setting threshold values, all critical performance parameters can be monitored, with local or remote alarms triggered whenever a threshold is exceeded. Critical components can then be identified with a few clicks, helping teams take immediate action.
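
A minimal sketch of such threshold-based alerting, in TypeScript and with hypothetical metric names and an illustrative raiseAlarm hook (not any specific monitoring product’s API), might look like this:

```typescript
// Minimal sketch of threshold-based monitoring. Metric names, thresholds,
// and the raiseAlarm hook are hypothetical placeholders.

interface Threshold {
  metric: string; // e.g. "cpuLoad" or "diskUsage"
  limit: number;  // raise an alarm when the reading exceeds this value
}

function raiseAlarm(device: string, metric: string, value: number): void {
  // In a real system this would trigger a local or remote alarm channel.
  console.warn(`ALARM: ${device} ${metric} = ${value} exceeded its threshold`);
}

function evaluate(device: string, readings: Record<string, number>, thresholds: Threshold[]): void {
  for (const t of thresholds) {
    const value = readings[t.metric];
    if (value !== undefined && value > t.limit) {
      raiseAlarm(device, t.metric, value);
    }
  }
}

// Example: flag a device whose CPU load exceeds 90% or disk usage exceeds 80%.
evaluate("router-01", { cpuLoad: 95, diskUsage: 40 }, [
  { metric: "cpuLoad", limit: 90 },
  { metric: "diskUsage", limit: 80 },
]);
```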

Software Developed by a Third Party: A Comparative Case Study

Overview

Nowadays, application development for smart devices is an evolving field of great economic and scientific interest. According to Gartner [9], the total number of mobile app store downloads worldwide will increase to 81 billion in 2013; paid downloads will surpass 8 billion and free downloads 73 billion. With the growing number of mobile platforms, developing mobile applications has become very difficult for companies, as they need to develop the same application for each target platform.

Developing native applications is the typical way of deploying mobile apps, but it has one major disadvantage: the source code cannot be reused for another platform, so the very same app must be redeveloped from scratch.

Native apps are developed using an Integrated Development Environment (IDE) that provides the necessary tools for building and debugging the applications. Native apps are more difficult to develop and require a higher level of experience and technological know-how than other types of applications.

The Comparative Review

The unexpected growth of the mobile market motivated the implementation of cross-platform software development environments that could make development easier and more efficient. The main categories of applications produced by these environments are web, hybrid, interpreted, and generated apps.

None of these approaches is prevalent, nor is any of them the best solution to the problem of developing cross-platform mobile applications. There are many other development environments that can be classified into intermediate categories. For example, the commercial software IBM Worklight subdivides the category of hybrid apps into two subcategories: hybrid web applications and mixed hybrid applications [20].

According to IBM, the source code of hybrid web applications consists exclusively of HTML5 and is executed through a web browser, while mixed hybrid applications can also execute native code calls through a native API.
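
To make the distinction concrete, the sketch below contrasts the two styles in TypeScript. The NativeBridge interface is a deliberately hypothetical placeholder rather than IBM Worklight’s actual API; real frameworks expose their own bridge mechanisms.

```typescript
// Minimal sketch contrasting hybrid web and mixed hybrid styles.
// NativeBridge is a hypothetical placeholder, not a real framework API.

// Hybrid web style: everything stays inside the browser runtime (HTML5/DOM).
function showGreeting(name: string): void {
  document.body.textContent = `Hello, ${name}`;
}

// Mixed hybrid style: web code calls out to native functionality
// (camera, GPS, contacts, ...) through a bridge exposed by the native container.
interface NativeBridge {
  call(plugin: string, action: string, args?: unknown): Promise<unknown>;
}

async function takePhoto(bridge: NativeBridge): Promise<unknown> {
  // The container forwards this call to platform-specific native code.
  return bridge.call("camera", "capture", { quality: 80 });
}
```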

Comparative Analysis of the Software

In this section, a comparative analysis of the aforementioned cross-platform mobile app development approaches is presented, based on a set of characteristics that converges with the criteria proposed in [16]. In that paper, the authors proposed a specific set of criteria for assessing cross-platform development approaches and used it to compare frameworks that bridge the gap between web and mobile systems, such as PhoneGap and Titanium Mobile.

The criteria we use for the comparative analysis of cross-platform development approaches are the following:

  • Marketplace deployment (distribution): Evaluates whether and how easily apps can be deployed to the app stores of mobile platforms, like Google Play or Apple’s iTunes.
  • Widespread technologies: Evaluates whether apps can be created using widespread technologies, such as JavaScript.
  • Hardware and data access: Evaluates whether apps have no access, limited access, or full access to the underlying device hardware and data.
  • User interface and look and feel: Evaluates whether apps inherently support native user interface components, or whether the native user interface and look and feel is simulated through libraries such as jQuery Mobile.

Conclusion

Technology has come a long way, and the variety of information objects we’re managing has changed a lot, but one tenet has remained constant: we’ve always focused on the intersection of people, processes, and information. As the Association for Intelligent Information Management, we help organizations put their information to work.

Banking Upheaval – Leading Banks Are Going the Cloud Way

Overview

New technology can only be as good as its implementation. Digital platforms can position banks so that they are able to immediately respond to consumer needs. However, financial institutions must approach this ‘digital reprogramming’ in a manner that is consciously designed to proactively exploit technology to its full potential. 

Developing and designing scalable digital platforms for innovation is no mean feat, and a successful shift to highly flexible infrastructure demands a team that understands the power of available technology.

What is Self-disruption in Banking?

According to Catherine Zhou, HSBC’s global head of ventures, digital innovation and partnerships, self-disruption is about reinventing part of your existing business. Sometimes this is done incrementally by improving existing processes, sometimes this is through radically challenging the status quo and doing things completely differently.

HSBC started with the concept that the bank could make peer-to-peer payments better for customers in Hong Kong, challenging existing banking payment models. Zhou explains: “In house, while maintaining our existing digital payments channels, we built a new social payments app called PayMe which is faster and more engaging. It now has the largest market share, with more than 2.5 million users, and we have expanded it to merchants through PayMe for Business.”

The Dearth of Creative Talent

As is natural in highly regulated industries like banking, there are limits as to how far concepts such as self-disruption can be pushed.

Bright believes that a balance must be struck: while not all areas warrant disruption, this balance can be found by combining a system of controls and regulation with innovation – “This is real state-of-the-art innovation in action.”

“When it comes to highly-regulated environments, it is better to ask permission beforehand than beg forgiveness afterwards. We therefore work closely and collaboratively with official institutions, including central banks and regulators, when exploring potentially disruptive innovations,” states Bright.

Conclusion

Zhou concurs, noting that the cloud offers financial services institutions the ability to architect for resilience in a way that is not possible with on-premises technology. For example, it allows the use of cutting-edge data science and machine learning, delivering improved functionality and increased automation.

What’s more, Newcomer argues, it is possible to achieve a stronger security posture in the cloud than on-premises. One reason for this is that, when migrating to the cloud, banks ensure that every control mechanism is meticulously put in place.

Why Hospitals Are Surging Towards AI-bots for Children With Cancer

Overview

Big data and machine learning are influencing almost every aspect of modern life, including entertainment, commerce, and healthcare. Netflix knows which films and series people like to watch, Amazon knows which items people like to buy when and where, and Google knows which symptoms and conditions people look up.

All of this information can be used to create highly detailed personal profiles, which can be useful not only for behavioral understanding and targeting, but also for predicting healthcare trends. There is great hope that the application of artificial intelligence (AI) will result in significant improvements in all areas of healthcare, from diagnostics to treatment. It is widely assumed that AI tools will facilitate and enhance human work rather than replace it.

Technological Advancements

There have been a great number of technological advances within the field of AI and data science in the past decade. Although research in AI for various applications has been ongoing for several decades, the current wave of AI hype is different from the previous ones. A perfect combination of increased computer processing speed, larger data collections and libraries, and a large AI talent pool has enabled rapid development of AI tools and technology, including within healthcare. This is set to bring about a paradigm shift in the level of AI technology and in its adoption and impact on society.

In particular, the development of deep learning (DL) has had an impact on the way we look at AI tools today and is the reason for much of the recent excitement surrounding AI applications. DL makes it possible to find correlations that were too complex to capture using previous machine learning algorithms. DL is largely based on artificial neural networks; compared with earlier neural networks, which had only 3–5 layers of connections, DL networks have more than 10 layers. This corresponds to simulating artificial neurons on the order of millions.
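
As a rough illustration of what “layers of connections” means, the sketch below runs a feed-forward pass over an arbitrary stack of weight matrices; a shallow network in this sense would use 3–5 such matrices and a deep network more than 10. The shapes and values are placeholders, and training is ignored entirely.

```typescript
// Minimal sketch of a feed-forward pass through a stack of fully connected
// layers. Depth is simply the number of weight matrices in `layers`.

type Matrix = number[][];

const relu = (x: number): number => Math.max(0, x);

// One layer: multiply the weight matrix by the activation vector, then apply ReLU.
function layerForward(weights: Matrix, input: number[]): number[] {
  return weights.map(row =>
    relu(row.reduce((sum, w, i) => sum + w * input[i], 0))
  );
}

// The whole network: feed the activations through every layer in turn.
function forward(layers: Matrix[], input: number[]): number[] {
  return layers.reduce((activation, weights) => layerForward(weights, activation), input);
}
```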

There are numerous companies that are frontrunners in this area, including IBM Watson and Google’s DeepMind. These companies have shown that their AI can beat humans in selected tasks and activities, including chess, Go, and other games. Both IBM Watson and Google’s DeepMind are currently being used for many healthcare-related applications. IBM Watson is being investigated for diabetes management, advanced cancer care and modeling, and drug discovery, but has yet to show clinical value to patients. DeepMind is also being looked at for applications including a mobile medical assistant, diagnostics based on medical imaging, and prediction of patient deterioration.

Precision Medicine

It is believed that within the next decade a large part of the global population will be offered full genome sequencing, either at birth or in adult life. Such genome sequencing is estimated to take up 100–150 GB of data and will provide a great tool for precision medicine. Work on interfacing genomic and phenotype information is still ongoing. Current clinical systems would need a redesign to be able to use such genomic data and realize its benefits.

Deep Genomics, a healthtech company, is looking at identifying patterns in vast genetic datasets as well as EMRs, in order to link the two with regard to disease markers. The company uses these correlations to identify therapeutic targets, either existing ones or new therapeutic candidates, with the purpose of developing individualized genetic medicines. It uses AI in every step of its drug discovery and development process, including target discovery, lead optimization, toxicity assessment, and innovative trial design.

Many inherited diseases result in symptoms without a specific diagnosis, and interpreting whole-genome data is still challenging due to the many genetic profiles. Precision medicine, supported by full genome sequencing and the use of AI, can provide methods to improve the identification of genetic mutations.

Solution

We believe that AI has an important role to play in the healthcare offerings of the future. In the form of machine learning, it is the primary capability behind the development of precision medicine, widely agreed to be a sorely needed advance in care. Although early efforts at providing diagnosis and treatment recommendations have proven challenging, we expect that AI will ultimately master that domain as well. Given the rapid advances in AI for imaging analysis, it seems likely that most radiology and pathology images will be examined at some point by a machine. Speech and text recognition are already employed for tasks like patient communication and capture of clinical notes, and their usage will increase.