DisCō - Visual DIScovery and COmmunication of complex time patterns in non-regularly gathered multigranular and multivariate data


To fill these gaps, DisCō* aims to develop novel Visual Analytics methods that visually as well as computationally analyse multivariate, time-oriented data and information in order to discover new and unexpected trends, patterns, and relationships. The main goal of these intertwined visual and analytical methods is to ensure high usability and good control of the integrated mining techniques through intuitive visualisations and visual interfaces.

* ... in Latin, "disco" (infinitive "discere") means "I learn"


Project Partners: 

March 2007 - February 2010

The capabilities to generate and collect data and information have grown explosively and now overwhelm traditional methods of data analysis such as spreadsheets, ad-hoc queries, or simple visualisations. Exploring trends, patterns, and relationships is particularly important when dealing with large amounts of data. The human perceptual system is highly sophisticated and especially well suited to spotting visual patterns, which is why visualisation is applied successfully to aid these tasks. However, facing the huge volumes of data to be analysed today, purely visual techniques are often not sufficient.

Time is an important data dimension that is common across many application domains, yet support for the analysis of time-oriented data and information remains weak. The main reason is that time - in contrast to other quantitative data dimensions, which are usually "flat" - has an inherent structure and distinct characteristics, such as calendar granularities (days, weeks, months), that increase its complexity dramatically and demand specialised methods for proper analysis and visualisation. In particular, the combination of multiple, heterogeneous data sources in real-world scenarios pushes current techniques to their limits.
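To illustrate the kind of problem the project addresses, the following minimal sketch (not project code; the variable names and sample values are hypothetical) shows how two irregularly sampled, multivariate measurement series can be aligned to a common, coarser calendar granularity - here, daily averages - using pandas:

```python
# Minimal sketch (assumptions: hypothetical sensor readings, pandas available):
# aligning irregularly gathered multivariate series to a daily granularity.
import pandas as pd

# Two variables, each measured at irregular, non-matching points in time.
temperature = pd.Series(
    [2.1, 3.4, 1.8],
    index=pd.to_datetime(["2007-03-01 06:15", "2007-03-01 18:40", "2007-03-02 07:05"]),
)
humidity = pd.Series(
    [81.0, 75.5],
    index=pd.to_datetime(["2007-03-01 12:00", "2007-03-02 09:30"]),
)

# Combine into one multivariate frame (indices are unioned, gaps become NaN),
# then resample to the coarser daily granularity, averaging per day.
frame = pd.DataFrame({"temperature": temperature, "humidity": humidity})
daily = frame.resample("1D").mean()
print(daily)
```

Even this simple alignment discards the intra-day structure of the raw samples, which hints at why analysis across multiple granularities needs dedicated methods rather than a single fixed aggregation.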


This project is supported by the program "FIT-IT Visual Computing" of the Federal Ministry of Transport, Innovation and Technology under grant no. 813388.