Uwe Korn is a Senior Data Scientist at the German RetailTec company Blue Yonder. His expertise is in building scalable architectures for machine learning services. Nowadays he focuses on the data engineering infrastructure needed to provide the building blocks for bringing machine learning models into production. As part of his work on efficient data interchange, he became a core committer to the Apache Parquet and Apache Arrow projects.
since November 2014
since October 2016
Maintenance of the project; community building and code contributions around Parquet integration, packaging setup, and Java interoperability.
since September 2016
Building the initial write path to make complete Parquet round trips possible in C++ and Python; maintenance and Apache Arrow integration.
September 2011 - August 2013
Algorithm implementation and performance tuning in Scala, C++, and SQL; quality testing and user experiments with Python and Node.js in the research area of outlier/graph mining.
April 2011 - July 2011
Tutoring students and correcting weekly assignments for the lecture “Algorithmen 1” (“Algorithms 1”).
October 2010 - February 2011
Tutoring students and correcting weekly assignments for the lecture “Grundbegriffe der Informatik” (“Basic Notions of Computer Science”).
November 2004 - September 2009
Tasks covered the full range of working with data: simple data entry, technical improvements to the data entry platform, adjusting code for data preprocessing, and helping to build classifiers that were then deployed into a production environment. Furthermore, I participated in writing image-processing software on the CPU and, with the first versions of CUDA, on the GPU. This gave me experience with the whole software lifecycle, from initial proofs of concept to production-grade libraries and the setup of a matching CI system with performance tests.
Intern at the Department for Knowledge Management.
Graduated with Distinction.
Master’s thesis: Distributed calculation of similarity measures for very large graphs
Courses included: Uncertainty Modelling for Intelligent Systems, Statistical Pattern Recognition, Learning in Autonomous Systems, Computational Genomics and Bioinformatics Algorithms, Artificial Intelligence and Logic Programming, and Cloud Computing.
Graduated with a final grade of 1.0.
Bachelor’s thesis: Parameter-free Outlier-aware Clustering on Attributed Graphs (published as the research paper “Efficient Algorithms for a Robust Modularity-Driven Clustering of Attributed Graphs”)
Courses included: Linear Algebra, Analysis, Algorithms & Data Structures, Operating Systems, Markov Chains, Cognitive Systems, Probability Theory, Programming Paradigms, Theoretical Foundations of Computer Science, Data Mining Paradigms and Methods for Complex Datasets, and Algorithms for Planar Graphs.
2010 - 2013
Member of the house parliament and of the team organising the bar and beverage replenishment; member and spokesperson of the self-organised network team/ISP “HaDiNet”; part of the developer team that built network management software in Python (Django, LDAP, …) to manage finances, contracts, printer accounts, and automated network routing for the 1,000 residents of the dormitory.
2006 - 2013
Member of the board at diocese level and part of the leadership team at local and regional level; supervisor and organiser of youth camps and weekly groups; took care of the (financial) accounting and the web presence and mail server of the whole organisation.