This announcement is for a postdoctoral position in AI and formal verification for systems with dependability requirements at Linköping University. It is financed within a Singapore-Sweden collaboration scheme under the Wallenberg AI, Autonomous Systems and Software Program (WASP). The candidate will contribute to methods for analysing systems with machine learning components used in decision making, and to the generation of explanations that, based on collected data, are acceptable within safety-critical systems (automotive, avionics). The project will be organised in collaboration with researchers at Chalmers (Professor Carl Seger) and Nanyang Technological University in Singapore (Professor Liu Yang), with the aim of creating symbolic abstractions of training data and of the classifications obtained by deep learning algorithms. These abstractions will then serve as the basis for formal verification of system dependability and for metamorphic testing of the resulting system. At LiU, the postdoc will collaborate with a PhD student in the lab working on safety-critical avionic applications.
We offer an internationally renowned research environment with excellent facilities and a wide network of collaborators in industry and academia in Sweden. The research environment at the Real-Time Systems Laboratory includes several PhD students working in the areas of critical infrastructure security, system modelling and verification, and load management and security in future networks (edge, 5G, vehicular). The postdoc will benefit from our wide network of collaborators and friendly atmosphere, and will strengthen the excellent research ongoing in the lab in collaboration with existing and forthcoming PhD students. We welcome candidates who are pursuing a future career in academia or in public/private sector research positions in Sweden, and who are looking to grow as researchers in our environment. The candidate will also be given the opportunity, with training from senior faculty members, to contribute to acquiring research grants.