mix'n'hack - 5th Swiss Open Cultural Data Hackathon


The 2019 edition of the Swiss Open Cultural Data Hackathon took place from September 6 to 8, 2019, at the cultural centre Les Arsenaux in Sion. This time, OpenGLAM CH and Museomix.ch worked together to combine the OpenGLAM hackathon format with the Museomix makeathon approach. The event was hosted by the Médiathèque Valais, the State Archives and the Cantonal Museums. Further partners included infoclio and the members of the Friends of OpenGLAM Network.

Back to the Greek Universe

Back to the Greek Universe is a web application that allows users to explore the ancient Greek model of the universe in virtual reality so that they can realize what detailed knowledge the Greeks had of the movement of the celestial bodies observable from the Earth's surface. The model is based on Claudius Ptolemy's work, which is characterized by the fact that it adopts a geo-centric view of the universe with the earth in the center.

Simulation of the Ptolemaic system of the universe


Ptolemy placed the planets in the following order:

  1. Moon
  2. Mercury
  3. Venus
  4. Sun
  5. Mars
  6. Jupiter
  7. Saturn
  8. Fixed stars

Renaissance woodcut illustrating the Ptolemaic sphere model

The movements of the celestial bodies as they appear to earthlings are expressed as a series of superposed circular movements (see deferent and epicycle theory), each characterized by its own radius and speed. The tabular values that serve as inputs to the model have been extracted from the literature.
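As a sketch of the deferent-and-epicycle idea, the apparent position of a planet can be computed by superposing two circular motions (function and parameter names are hypothetical; the actual model uses the tabulated Ptolemaic values mentioned above):

```python
import math

def epicycle_position(t, deferent_radius, deferent_period,
                      epicycle_radius, epicycle_period):
    """Superpose two circular motions: a point moving on an epicycle
    whose centre itself moves along the deferent circle around the Earth."""
    a1 = 2 * math.pi * t / deferent_period   # angle of the epicycle centre on the deferent
    a2 = 2 * math.pi * t / epicycle_period   # angle of the planet on the epicycle
    x = deferent_radius * math.cos(a1) + epicycle_radius * math.cos(a2)
    y = deferent_radius * math.sin(a1) + epicycle_radius * math.sin(a2)
    return x, y

# At t = 0 both angles are zero, so the planet sits at the sum of both radii.
print(epicycle_position(0, 10.0, 365.0, 2.0, 88.0))  # -> (12.0, 0.0)
```

The planet's distance from the Earth then oscillates between the difference and the sum of the two radii, which is how the model reproduces the apparent retrograde loops of the planets.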

Demo Video

Claudius Ptolemy (~100-160 AD) was a Greek scientist working at the library of Alexandria. One of his most important works, the «Almagest», sums up the geographic, mathematical and astronomical knowledge of the time. It is the first outline of a coherent system of the universe in the history of mankind.

Back to the Greek Universe is a VR model that rebuilds Ptolemy’s system of the universe on a scale of 1:1 billion. The planets are rendered 100 times larger, and the earth rotates 100 times more slowly. The planets’ orbital periods are 1 million times faster than they would be according to Ptolemy’s calculations.
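To make the scaling concrete, here is a minimal sketch converting real-world values into simulation values, using the factors stated above (the example numbers are illustrative only):

```python
# Scale factors stated for the model.
DISTANCE_SCALE = 1 / 1_000_000_000   # 1:1 billion for distances
TIME_SPEEDUP = 1_000_000             # orbital periods run 1 million times faster

def scaled_distance_km(real_km):
    """Distance in the model, in km, for a given real distance."""
    return real_km * DISTANCE_SCALE

def simulated_period_hours(real_period_days):
    """How long one orbit takes in the simulation, in hours."""
    return real_period_days * 24 / TIME_SPEEDUP

# Example: the Moon, ~384,400 km away, orbiting in ~27.3 days.
print(scaled_distance_km(384_400))      # model distance in km
print(simulated_period_hours(27.3))     # one lunar orbit in simulation hours
```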

Back to the Greek Universe was coded and presented at the Swiss Open Cultural Data Hackathon/mix'n'hack 2019 in Sion, Switzerland, from Sept 6-8, 2019, by Thomas Weibel, Cédric Sievi, Pia Viviani and Beat Estermann.

Instructions

This is how to fly Ptolemy's virtual spaceship:

  • Point your smartphone camera towards the QR code, tap on the popup banner in order to launch into space.
  • Turn around and discover the ancient Greek solar system. Follow the planets' epicyclic movements (see above).
  • Tap in order to travel through space, in any direction you like. Every single tap will teleport you roughly 18 million miles forward.
  • Back home: Point your device vertically down and tap in order to teleport back to earth.
  • Gods' view: Point your device vertically up and tap in order to overlook Ptolemy’s system of the universe from high above.

The cockpit on top serves as a time and distance display: the years and months indicator gives you an idea of how rapidly time passes in the simulation, while the miles indicator always displays your current distance from the earth's center (in millions of nautical miles).

Data

The data used include 16th-century prints of Ptolemy's main work, the Almagest (both in Greek and Latin), and high-resolution surface photos of the planets in Mercator projection. The photos are mapped onto rotating spheres by means of Mozilla's WebVR framework A-Frame.
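As an illustration of what such texture mapping looks like in A-Frame, the following sketch generates the markup for a textured, spinning sphere (the attribute syntax follows A-Frame's documented `a-sphere` and `animation` components; the texture path is a placeholder):

```python
def sphere_markup(texture_src, radius, spin_seconds):
    """Build A-Frame markup for a sphere with a planetary surface texture
    wrapped around it, spinning around its own axis."""
    return (
        f'<a-sphere src="{texture_src}" radius="{radius}" '
        f'animation="property: rotation; to: 0 360 0; '
        f'dur: {spin_seconds * 1000}; loop: true; easing: linear">'
        f'</a-sphere>'
    )

# Hypothetical texture file; A-Frame wraps the rectangular image around the sphere.
print(sphere_markup("textures/mars.jpg", 0.53, 30))
```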

Earth
Earth map (public domain)

Moon
Moon map (public domain)

Mercury
Mercury map (public domain)

Venus
Venus map (public domain)

Sun
Sun map (public domain)

Mars
Mars map (public domain)

Jupiter
Jupiter map (public domain)

Saturn
Saturn map (public domain)


Stars
Stars map (Milky Way) (Creative Commons Attribution 4.0 International)

Primary literature

Secondary literature

Version history

2019/09/07 v1.0: Basic VR engine, interactive prototype

2019/09/08 v1.01: Cockpit with time and distance indicator

2019/09/13 v1.02: Space flight limited to stars sphere, minor bugfixes

2019/09/17 v1.03: Planet ecliptics adjusted

Team

http://make.opendata.ch/wiki/project:back_to_the_greek_universe

CoViMAS

Collaborative Virtual Museum for All Senses (CoViMAS) is an extended virtual museum which engages all the visitor's senses. It is a substantial upgrade and expansion of our award-winning GLAMhack 2018 project “Walking around the Globe” (http://make.opendata.ch/wiki/project:virtual_3d_exhibition), in which the DBIS Group from the University of Basel teamed up with the ETH Library to introduce a prototype of an exhibition in Virtual Reality.

Day One

CoViMAS joins the forces of different disciplines to form a group comprising a maker, a content provider, developers, a communicator, a designer and a user experience expert. Having different backgrounds and areas of expertise gave us a great opportunity to explore different ideas and broaden the horizons of the project.

Two vital components of this project are the Virtual Reality headset and the datasets to be used. Our HTC Vive Pro VR headsets were converted to wireless mode after our last experiment, which proved that freedom of movement without wires attached to the user improves both the user experience and the feasibility of use.

Our team's content provider and designer spent an invaluable amount of time searching for representative postcards and audio that could be integrated into the virtual space and improve the virtual reality experience by engaging extra senses. This includes selecting postcards which can be touched as well as seen, in virtual and non-virtual reality. Additionally, the idea came up of hearing a sound related to the picture being viewed. This audio should correlate with the picture and recreate the sensation of the picture's environment for the user in the virtual world.

To integrate modern methods of image manipulation through artificial intelligence, we used a deep learning method to colorize the gray-scale images of the dataset “Fotografien aus dem Wallis von Charles Rieder”. The colorized images allow the visitor to get a more tangible feeling for the pictures he/she is viewing. The initial run of the algorithm showed the challenges we face: for example, faded or scratched parts of the pictures could not be colorized very well.
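The trained colorization network itself is beyond the scope of this write-up; as a highly simplified sketch of the pipeline's shape, a colorizer keeps the luminance of each pixel and asks a model to predict the missing color information (here a sepia tint stands in for the learned predictor, purely for illustration):

```python
def colorize(gray_image, predict_color):
    """Colorize a gray-scale image (nested lists of 0-255 values) by
    mapping every gray pixel through a per-pixel color predictor."""
    return [[predict_color(v) for v in row] for row in gray_image]

# Placeholder "model": a sepia tint standing in for the deep learning predictor.
def sepia(v):
    return (min(255, int(v * 1.07)), int(v * 0.74), int(v * 0.43))

gray = [[0, 128], [255, 64]]          # tiny 2x2 gray-scale "image"
print(colorize(gray, sepia))
```

In the real project the predictor is a neural network trained on color photographs, which is precisely where faded or scratched regions cause trouble: their gray values no longer match the statistics the model was trained on.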


Day Two

Although the VR exhibition is taken from our previous participation in GLAMhack 2018, it needed to be adjusted to the new content. We designed the rooms to showcase the dataset “Postkarten aus dem Wallis (1890-1950)”. At this point, the postcards selected to be enriched with additional senses were sent to the FabLab to create a haptic card, as well as a feather pallet to be used alongside one postcard that depicts a goose.

The fabricated elements of our exhibition are attached to a tracker which can be seen through the VR glasses; this allows the user to be aware of the location of the object and to sense it.

The colorization improved through the day, thanks to alterations in the training setup and the parameters used to tune the images. The results at this stage are relatively good.

The VR exhibition hall was also adjusted to automatically load the images from the postcards, as well as the colorized images alongside their originals.
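A minimal sketch of such automatic loading (the directory layout and naming convention are hypothetical) could pair each original postcard with its colorized counterpart by filename:

```python
from pathlib import Path

def pair_postcards(originals_dir, colorized_dir):
    """Pair every original postcard with its colorized counterpart,
    matching files by name; None where no colorized version exists."""
    colorized = {p.name: p for p in Path(colorized_dir).glob("*.jpg")}
    return [(orig, colorized.get(orig.name))
            for orig in sorted(Path(originals_dir).glob("*.jpg"))]
```

The exhibition can then hang each pair side by side without any manual wiring of individual images.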

And late at night, while finalizing the work for the next day, most of our stickers changed status from the “Implementation” phase to the “Done” phase!

Day Three

CoViMAS reached its final stage on the last day. The room design is finished and the locations of the images on the walls are determined. The tracker location was updated in the VR environment to represent the real location of the object. With this improvement, a postcard can be touched while being viewed simultaneously.

Data

Team

http://make.opendata.ch/wiki/project:covimas

Opera Forever

Opera Forever is an online collaboration platform and social networking site to collectively explore large amounts of opera recordings.

The platform allows users to tag audio sequences with various types of semantics, such as personal preference, emotional reaction, specific musical features, technical issues, etc. Through the analysis of personal preference and/or emotional reaction to specific audio sequences, a characterization of personal listening tastes will be possible, and people with similar (or very dissimilar) tastes can be matched. The platform will also contain a recommendation system based on preference information and/or keyword search.
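The tagging model described above can be sketched as a simple data structure (the names and fields are hypothetical, not the platform's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class SequenceTag:
    user: str
    kind: str          # e.g. "preference", "emotion", "musical_feature", "technical_issue"
    value: str

@dataclass
class AudioSequence:
    recording_id: str
    start_s: float     # segment boundaries within the recording, in seconds
    end_s: float
    tags: list = field(default_factory=list)

    def tag(self, user, kind, value):
        self.tags.append(SequenceTag(user, kind, value))

seq = AudioSequence("tosca-1965-live", 120.0, 245.5)
seq.tag("anna", "emotion", "moving")
seq.tag("ben", "preference", "favorite")
print([t.kind for t in seq.tags])  # -> ['emotion', 'preference']
```

Because every tag records who created it, the same store supports both collection-wide analysis and per-user taste profiles.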

Background: The Bern University of the Arts has inherited a large collection of about 15'000 hours of bootleg live opera recordings. Most of these recordings are unique, and many individual recordings are rather long (up to 3-4 hours); hence the idea of segmenting the recordings so as to allow for the creation of semantic links between segments and enhance the possibilities of collectively exploring the collection.

Core Idea: Users engaging in “active” listening leave semantic traces behind that can be used as a resource to guide further exploration of the collection, both by themselves and by third parties. The approach can be used for an entire spectrum of users, ranging from occasional opera listeners, through opera amateurs, to interpretation researchers. The tool can be used as a collaborative tagging platform among research teams or within citizen science settings. By putting the focus on the listeners and their personal reaction to the audio segments, the perspective of analysis can be switched to the user, e.g. by creating typologies or clusterings of listening tastes or by using the approach for match-making in social settings.
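The match-making of listeners with similar (or very dissimilar) tastes could be sketched as follows, comparing two users' ratings of the same segments with a plain cosine similarity (purely illustrative; the platform's actual matching method is not specified here):

```python
import math

def taste_similarity(prefs_a, prefs_b):
    """Cosine similarity between two users' ratings of the same segments
    (dicts mapping segment id -> rating); close to 1.0 = very similar taste."""
    shared = prefs_a.keys() & prefs_b.keys()
    if not shared:
        return 0.0
    dot = sum(prefs_a[s] * prefs_b[s] for s in shared)
    norm_a = math.sqrt(sum(prefs_a[s] ** 2 for s in shared))
    norm_b = math.sqrt(sum(prefs_b[s] ** 2 for s in shared))
    return dot / (norm_a * norm_b)

anna = {"seg1": 5, "seg2": 1}
ben  = {"seg1": 5, "seg2": 1}
print(taste_similarity(anna, ben))  # close to 1.0 for identical ratings
```

The same score can drive both the recommendation system (suggest what similar listeners liked) and the deliberate matching of very dissimilar listeners.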

TimeGazer

Welcome to TimeGazer: A time-traveling photo booth enabling you to send greetings from historical postcards.


Based on the wonderful “Postcards from Valais (1890-1950)” dataset, consisting of nearly 4000 historic postcards of Valais, we created a prototype of a Mixed-Reality photo booth.

Choose a historic postcard as a background, and the person in front of the camera will be virtually style-transferred onto the postcard.
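Since the booth works with a blue screen, the core compositing step can be sketched as a chroma key: wherever the camera pixel is "blue screen", show the postcard instead (toy nested-list images and an illustrative threshold; the actual booth additionally applies style transfer):

```python
def composite(foreground, background, is_blue):
    """Chroma-key compositing: replace blue-screen pixels of the camera
    image (foreground) with the corresponding postcard pixels (background)."""
    return [
        [bg if is_blue(fg) else fg for fg, bg in zip(frow, brow)]
        for frow, brow in zip(foreground, background)
    ]

# Illustrative blue-screen test: strongly blue, weak red/green.
def is_blue(pixel):
    r, g, b = pixel
    return b > 150 and r < 100 and g < 100

person   = [[(200, 180, 160), (0, 0, 255)]]   # one skin-tone pixel, one blue pixel
postcard = [[(90, 120, 90), (90, 120, 90)]]
print(composite(person, postcard, is_blue))   # the blue pixel becomes postcard
```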

  • Photobomb a historical postcard
  • A photo booth for time traveling
  • Send greetings from the postcard
  • Virtually enter the historical postcard


Mockup of the process.


Potentially, VR-tracker-equipped props could be used to add selectable objects virtually into the scene.

Technology

This project is roughly based on a project from last year, which resulted in an active research project at the Databases and Information Systems group of the University of Basel: VIRTUE.
Hence, we use a similar setup:

Results

Project

Blue Screen

Printer box

Standard box on MakerCase:

Modified for the input of paper and output of postcard:

The SVG and DXF box project files.

Data

Quote from the data introduction page:

A collection of 3900 postcards from Valais. Some highlights are churches, cable cars, landscapes and traditional costumes.

Source: Musées cantonaux du Valais – Musée d’histoire

Team

  • Dr. Ivan Giangreco
  • Dr. Johann Roduit
  • Lionel Walter
  • Loris Sauter
  • Luca Palli
  • Ralph Gasser