

MonetDB autumn 2016 feature release outlook

The Jun2016 major feature release is behind us, and now it's time to look ahead to the next one. The list of topics under consideration, tracked on Bugzilla, covers the full range of functional SQL enhancements, quality assurance, kernel algorithms, code cleaning, and packaging. That is too much to tackle all at once, so our prioritized list contains the following topics for possible inclusion in the next feature release.

MonetDB/Python Loader Functions

The primary purpose of a database is to store and manage data; without data, a database is not very useful. As such, the first thing you will do after launching a database is load your data into it. In MonetDB, the primary way of loading large amounts of data is the `COPY INTO` statement, which can quickly load large CSV files into a table.
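As a minimal sketch of what bulk loading looks like, the snippet below writes a small sample CSV file and builds the `COPY INTO` statement one would run against MonetDB to load it. The table name `cities` and its contents are hypothetical, chosen only for illustration.

```python
import csv
import os
import tempfile

# Write a small sample CSV file (hypothetical data, for illustration only).
path = os.path.join(tempfile.mkdtemp(), "cities.csv")
with open(path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows([(1, "Amsterdam"), (2, "Helsinki")])

# The COPY INTO statement to bulk-load the file into a (hypothetical)
# 'cities' table with matching columns: field, record, and quote
# delimiters are given explicitly.
statement = (
    "COPY INTO cities FROM '%s' "
    "USING DELIMITERS ',', E'\\n', '\"';" % path
)
print(statement)
```

Running this statement requires a MonetDB server with a matching `cities` table; the Python part here only prepares the file and the SQL text.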

DataFungi, from Rotting Data to Purified Information

Martin Kersten presented his idea to cope with the ever increasing big data growth in a keynote at the 32nd IEEE International Conference on Data Engineering, May 16-20, 2016, in Helsinki, Finland.

Continuous Integration of MonetDB and SKA using Jenkins

MonetDB is used for database research, but it is also widely used in production across a large set of applications. One of them is astronomy: the fact that MonetDB is a column store makes it well suited to the large analytical workloads of modern astronomy. So it is not surprising that MonetDB was chosen as part of the software pipeline of the LOFAR radio telescope. A more detailed explanation can be found in the previous blog post "Time-domain radio astronomy with MonetDB" by Bart Scheers.

Time-domain radio astronomy with MonetDB

The international low-frequency radio telescope LOFAR (located in The Netherlands) is one of the first telescopes to completely integrate observations with real-time computation and data storage facilities in its overall design. Signals received by thousands of antennas are locally pre-processed and digitised before they are transported over a 10Gb/s link to a remote supercomputer. There, the raw data is processed further, e.g., imaged, after which dedicated software pipelines pick up the calibrated data again to do their science.
