Health Management in a Geographic Context - GEOHEALTH

The GEOHEALTH solution proposes to manage health problems more efficiently by resolving them in their natural geographical context, which permits a sharper analysis of existing patient pathologies as they are distributed across the territory,

for both

- BETTER patient guidance and orientation

- and BETTER health-care infrastructure and facilities management

Idea

The idea is to help better match the supply of health services to the demand, and more besides: the solution aims to provide a map of the health services required, and of the intensity at which they must be delivered, distributed geographically as the health requirements of the population dictate.

At its most global level, the data recorded may be used by health authorities to plan, on the basis of current demand, their infrastructure offering in terms of facilities (buildings etc.) and the personnel deployed across the territory. Moreover, the same data viewed as a historical accumulation, combined with machine learning and artificial intelligence techniques, will enable the prediction of future demands and requirements and help plan a forward-looking infrastructure for the medical service.

At the more local level of the individual user, the patient may more readily find (through an appropriate app) the nearest infrastructure capable of responding to his or her individual health needs: the location of the nearest hospitals, the nearest medical staff, and even nearby patients with a similar pathology, should that be of interest.
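As a rough illustration of the patient-facing lookup, the nearest suitable facility can be found with a great-circle (haversine) distance calculation. The facility records, coordinates and service names below are purely hypothetical; a real app would query the cloud databases described later.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_facility(patient_pos, facilities, needed_service):
    """Return the closest facility offering the needed service, or None."""
    candidates = [f for f in facilities if needed_service in f["services"]]
    if not candidates:
        return None
    return min(candidates,
               key=lambda f: haversine_km(*patient_pos, f["lat"], f["lon"]))

# Hypothetical facilities around Milan, with invented service catalogues.
facilities = [
    {"name": "Hospital A", "lat": 45.48, "lon": 9.15, "services": {"cardiology"}},
    {"name": "Clinic B",   "lat": 45.46, "lon": 9.19, "services": {"cardiology", "oncology"}},
]
print(nearest_facility((45.465, 9.186), facilities, "oncology")["name"])  # Clinic B
```

The same distance function could serve the institution's side too, e.g. when ranking patients by travel distance while fixing appointments.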

Likewise, the health institution may keep track of where its enrolled patients are geographically located, possibly assist them with appropriate means of transport (ambulance service etc.), and use the patient's distance and location when organising visits and fixing appointments for medical examinations. By tracking how individual pathologies are distributed across the territory, the health institution may also better deploy its resources logistically, making them most available where the greatest demand is located.

By mapping possible factors for the onset of a pathology, or simply the surrounding environmental factors, and applying GIS technology together with machine learning and artificial intelligence techniques, it should be possible to identify with greater clarity the most important causes of, or contributing factors to, a given pathology.

With further artificial intelligence applied to larger quantities of data (Big Data), especially historically accumulated data, it may even be possible to extract predictive behaviour for the onset of a given pathology. In time, a model may be developed that describes the full mechanism necessary for its onset, i.e. the concomitant factors that must all be present to give rise to the pathology. A system may then be set up in which an automatism identifies, ahead of time, geographic hot-spots at high risk of incurring a given pathology in the future, keeps them under close surveillance, monitors them with extra care, and perhaps even intervenes with precautionary measures (such as medical intervention and health-prevention campaigns) to mitigate that risk on a priority, forward-looking basis.
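A minimal sketch of how such hot-spots might be flagged, assuming case records already reduced to latitude/longitude pairs: bin cases into a coarse grid and rank the densest cells. The sample coordinates, cell size and threshold are invented for illustration; a real system would apply proper spatial clustering to far richer data.

```python
from collections import Counter

def hotspots(cases, cell_deg=0.05, threshold=3):
    """Bin case coordinates into a lat/lon grid and flag dense cells.

    cases: iterable of (lat, lon) pairs, one per recorded occurrence of the
    pathology under study. Returns the grid cells (identified by their
    south-west corner) whose case count reaches the threshold, densest first.
    """
    counts = Counter(
        (round(lat // cell_deg * cell_deg, 4), round(lon // cell_deg * cell_deg, 4))
        for lat, lon in cases
    )
    return [(cell, n) for cell, n in counts.most_common() if n >= threshold]

# Hypothetical case coordinates: four cases clustered near Milan, one outlier.
cases = [(45.46, 9.18), (45.47, 9.19), (45.48, 9.16), (45.49, 9.17), (44.00, 8.00)]
print(hotspots(cases))  # one dense cell around (45.45, 9.15)
```

Cells flagged this way would be the candidates for closer surveillance and precautionary campaigns.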

Even in a very basic and restricted form, such a system should at least be capable of ensuring that enough medical resources are made strategically available to meet the future needs it has forecast.

Implementation

For its implementation, the proposed solution relies on cross-referencing patients' residential addresses, their pathologies and the available health-care infrastructure (buildings, medical staff and resources in general). All this data is stored in various databases in the cloud, to reap the maximum advantage of reduced costs and agile scaling without downtime.
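The cross-referencing step can be sketched as a simple join between the patient registry and the pathology records, aggregating demand by zone. The record layouts, zone codes and pathology codes below are assumptions for illustration, not a defined schema.

```python
# Hypothetical records as they might arrive from separate cloud databases.
patients = [
    {"id": 1, "address": "Via Roma 1, Milano", "zone": "MI-01"},
    {"id": 2, "address": "Corso Italia 5, Milano", "zone": "MI-02"},
]
pathologies = [
    {"patient_id": 1, "code": "E11"},  # type 2 diabetes
    {"patient_id": 2, "code": "I10"},  # hypertension
]

def demand_by_zone(patients, pathologies):
    """Cross-reference patients with their pathologies, grouped by zone."""
    zone_of = {p["id"]: p["zone"] for p in patients}
    demand = {}
    for record in pathologies:
        zone = zone_of[record["patient_id"]]
        demand.setdefault(zone, []).append(record["code"])
    return demand

print(demand_by_zone(patients, pathologies))  # {'MI-01': ['E11'], 'MI-02': ['I10']}
```

The resulting per-zone demand map is exactly the kind of input the infrastructure-planning use case described above would consume.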

This vast body of data also includes the results of constantly monitoring customer (patient) satisfaction. Satisfaction is assessed either by asking patients to fill in questionnaires, or more automatically and unobtrusively by comparing over time the performance of the health facilities offered and the efficacy of the treatment provided (naturally, for equivalent pathologies and patients): shorter times, reduced costs, help more readily available at nearer and more accessible locations, and the validity of the treatment itself, in terms of keeping the rate and severity of relapse of the same pathology in the same patient as low as possible.

These quality/performance parameters, all silently monitored and stored in databases in the cloud along with all the others, allow the artificial intelligence component of the system to elaborate an indicator of the quality of the health service provided. By monitoring this indicator over time, the system can ensure that it grows constantly, or at least prevent it from falling below levels of performance already attained.
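One possible shape for such an indicator, assuming the individual parameters have already been normalised to a 0-1 scale (1 = best): a weighted average, plus a check that the current score has not fallen below the best level already attained. The metric names and values are illustrative only.

```python
def quality_indicator(metrics, weights=None):
    """Combine normalised quality metrics (each in [0, 1], 1 = best) into one score."""
    weights = weights or {name: 1.0 for name in metrics}  # equal weights by default
    total = sum(weights.values())
    return sum(value * weights[name] for name, value in metrics.items()) / total

def check_no_regression(history, current):
    """True if the current score has not fallen below the best level attained."""
    return current >= max(history, default=current)

# Illustrative metric names, not a defined schema.
metrics = {"waiting_time": 0.8, "cost": 0.7, "relapse_rate": 0.9}
indicator = quality_indicator(metrics)  # plain average here, approximately 0.8
```

Running `check_no_regression` against the stored history of scores is the simplest form of the "never fall below levels already attained" guarantee.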

The technology

Big Data ingestion – ETL to prepare the data for machine learning

Centralized databases in the Cloud

Data flows are ingested in streaming form from multiple sources: patients' medical records, health-care infrastructure data (institutional and medical-staff records, facility and personnel management databases), and patient pathology records.
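A toy sketch of this multi-source ingestion: records from the different feeds are interleaved round-robin into one stream, each tagged with its source so that downstream ETL steps can route it. The feed names and record fields are hypothetical.

```python
from itertools import zip_longest

def ingest(*sources):
    """Interleave records from several feeds into one tagged stream (round-robin).

    Each source is a (name, records) pair standing in for a live feed from
    one of the cloud databases; the source name is attached to every record.
    """
    tagged = [[{"source": name, **rec} for rec in records]
              for name, records in sources]
    for batch in zip_longest(*tagged):
        for record in batch:
            if record is not None:
                yield record

# Hypothetical feeds with invented record fields.
medical = [{"patient_id": 1, "event": "visit"}]
pathology = [{"patient_id": 1, "code": "E11"}, {"patient_id": 2, "code": "I10"}]
stream = list(ingest(("medical_records", medical), ("pathology_records", pathology)))
```

In production this role would be played by a streaming platform rather than in-memory lists, but the tagging-and-merging shape is the same.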

Machine learning and artificial intelligence inference provide the elaboration of the model.

Scoring gives an indication of the degree of match between reality and the idealised model. This allows deductions to be made, and the system can generate advice on how to proceed.
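Scoring could, for example, be expressed as the mean agreement between the model's forecast and observed reality per geographic zone, yielding a value in [0, 1] where 1 means a perfect match. The zone codes and case counts below are invented for illustration.

```python
def model_score(predicted, observed):
    """Mean per-zone agreement between forecast and actual case counts.

    predicted / observed: dicts mapping zone -> case count.
    Returns a value in [0, 1]; 1.0 means the model matches reality exactly.
    """
    agreements = []
    for zone, actual in observed.items():
        forecast = predicted.get(zone, 0)
        worst = max(actual, forecast, 1)  # avoid division by zero
        agreements.append(1 - abs(actual - forecast) / worst)
    return sum(agreements) / len(agreements)

# Hypothetical forecast vs. reality for two zones.
print(model_score({"MI-01": 10, "MI-02": 5}, {"MI-01": 8, "MI-02": 5}))  # 0.9
```

A score trending towards 1.0 over successive iterations is the signal that the refined model is converging on reality.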

Several iterations over the same data are used to refine the model developed, addressing the different problems to be resolved.

 

Deployment

Finally, the deployment phase is reached. If artificial intelligence has succeeded in providing a model for the solution of the specific problem analysed, then executing the model on large, historically accumulated data refines it further and makes it more and more reliable. Once the model has been sufficiently tested, it may be run as an automatism and its indications followed as a matter of routine. In fact, such a model will permit the development of a suitable application that can even MAKE PREDICTIONS of future reality and suggest more long-term solutions.