This page lists projects I currently supervise. Students who have finished their projects and graduated are listed on the page of supervised theses. A LaTeX template for OU theses has been kindly provided by Annet Vink and Katleen de Nil (based on the work by Niels Tielenburg).
Data on the web is often volatile: prices you see in a webshop today might have changed tomorrow. In this project, we investigate various aspects of online tracking and build tools to help users track websites themselves.
The project focuses on using scraping technology to investigate security and privacy on the web. Examples of such investigations include price differentiation and website login security.
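The core of tracking volatile web data is deciding whether a page has changed between two scrapes. A minimal sketch of this idea (the function names and the example price markup are invented for illustration, not part of any project's tooling):

```python
import hashlib

def content_digest(html: str) -> str:
    """Return a stable digest of scraped page content, so two
    snapshots can be compared without storing the full HTML."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def has_changed(old_digest: str, new_html: str) -> bool:
    """True if the freshly scraped page differs from the stored snapshot."""
    return content_digest(new_html) != old_digest

# Example: a price field changes between two scrapes.
snapshot = content_digest("<span class='price'>EUR 19.99</span>")
print(has_changed(snapshot, "<span class='price'>EUR 17.49</span>"))  # True
```

A real tracker would first normalise the page (strip timestamps, session tokens, rotating ads) before hashing, otherwise every scrape looks like a change.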
Crime has more and more ties to the digital world. The domain of digital forensics focuses on investigating and preserving digital evidence. However, three trends combine to make this increasingly hard: (i) data carriers are growing in size, meaning there are orders of magnitude more data to sift through; (ii) more and more items are becoming data carriers, meaning many different items may need to be investigated; (iii) the ever-increasing diversity in apps means that there is an ever-increasing diversity in file formats where evidence may be stored.
This project focuses on improving and generalising techniques for recovering deleted files, in order to preserve this important line of digital forensics for future cases.
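A classic technique for recovering deleted files is file carving: scanning the raw disk image for known header and footer signatures, ignoring file-system metadata entirely. A toy sketch for JPEG files (real carvers handle fragmentation, false headers, and many more formats):

```python
JPEG_SOI = b"\xff\xd8\xff"  # JPEG start-of-image marker
JPEG_EOI = b"\xff\xd9"      # JPEG end-of-image marker

def carve_jpegs(image: bytes) -> list[bytes]:
    """Carve candidate JPEG files out of a raw disk image by
    pairing each header signature with the next footer signature."""
    carved = []
    pos = 0
    while True:
        start = image.find(JPEG_SOI, pos)
        if start == -1:
            break
        end = image.find(JPEG_EOI, start)
        if end == -1:
            break
        carved.append(image[start:end + 2])
        pos = end + 2
    return carved
```

Generalising this beyond contiguous, signature-delimited files is exactly where the research challenge lies: fragmented files break the header-to-footer assumption.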
Vlot et al. showed that over 10% of websites identify scrapers and may serve them a different view; for example, they found websites that omitted advertisements when visited by a scraper. Scientific studies based on scrapers may therefore report incorrect data. The goal of this project is to design and implement a "stealth" version of a scraper: one that is not detected by current detection algorithms.
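To make the detection problem concrete, here is a toy server-side check in the spirit of such techniques (the signal list and header rules are invented simplifications, not the actual detection methods Vlot studied):

```python
# Invented, illustrative list of automation tell-tales in User-Agent strings.
BOT_SIGNALS = ("headlesschrome", "phantomjs", "python-requests")

def looks_like_bot(headers: dict) -> bool:
    """Flag a request as automated if its headers betray a scraper:
    a known automation User-Agent, or a header set no real browser sends."""
    ua = headers.get("User-Agent", "").lower()
    if any(signal in ua for signal in BOT_SIGNALS):
        return True
    # Heuristic: ordinary browsers send an Accept-Language header.
    return "Accept-Language" not in headers

naive = {"User-Agent": "python-requests/2.31"}
stealth = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
           "Accept-Language": "en-US,en;q=0.9"}
print(looks_like_bot(naive), looks_like_bot(stealth))  # True False
```

Real detectors also probe client-side properties (e.g. JavaScript-visible browser attributes), which is why a stealth scraper needs more than plausible headers.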
Browser fingerprinting has become a focus of current research. However, an up-to-date view of how commercial fingerprinters work is lacking. The goal of this project is to identify current commercial fingerprinters in widespread use and analyse their fingerprinting strategies.
For his thesis, Robbert Noordzij created a simulator to measure file fragmentation on virtual machines. He aimed for a realistic simulation, that is, one approximating how a regular human interacts with a computer. To that end, his simulator types documents in Word using a natural rhythm derived from measured user typing data.
In this project, we take this one step further. We extend his simulator to install a popular open source integrated development environment (IDE), and then use the IDE to replay the development of an open source project by replaying that project's publicly available git commits.
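A natural typing rhythm can be approximated by sampling one inter-key delay per character from a distribution fitted to human keystroke data. A minimal sketch (the distribution and its parameters are invented placeholders, not Noordzij's actual model):

```python
import random

def keystroke_delays(text: str, mean_ms: float = 180.0,
                     sd_ms: float = 60.0, seed: int = 42) -> list[float]:
    """Sample an inter-key delay (in milliseconds) for each character,
    drawn from a normal distribution truncated at a physical minimum."""
    rng = random.Random(seed)  # seeded for reproducible simulations
    return [max(20.0, rng.gauss(mean_ms, sd_ms)) for _ in text]

delays = keystroke_delays("def main():")
```

Replaying a git commit would then amount to typing each diff hunk into the IDE character by character, sleeping for the sampled delay between keystrokes.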
The work by Gabry Vlot found that over 10% of websites detect webbots. Any scraping study should therefore either account for this or take countermeasures. The goal of this project is to create a scraper that can successfully defeat the detection techniques found by Vlot.
Several studies found that users are presented with different prices under certain conditions. The goal of this project is to create a framework to enable simultaneous structured testing of price differentiation on multiple platforms (apps, websites).
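The heart of such a framework is querying the same product under different client profiles and comparing the observed prices. A sketch of that core loop, with the fetch callback standing in for a real scraper or app-automation driver (all names and the example catalogue are hypothetical):

```python
def detect_price_differentiation(fetch_price, profiles: dict):
    """Query one product under each client profile; report the observed
    prices and whether any two profiles saw a different price.
    `fetch_price(profile)` is an assumed callback wrapping a real
    scraper (website) or automation driver (app)."""
    prices = {name: fetch_price(cfg) for name, cfg in profiles.items()}
    return prices, len(set(prices.values())) > 1

# Hypothetical stand-in for live scraping results.
catalogue = {"desktop-web": 19.99, "ios-app": 21.99}
fetch = lambda cfg: catalogue[cfg["platform"]]
profiles = {"laptop": {"platform": "desktop-web"},
            "iphone": {"platform": "ios-app"}}
prices, differs = detect_price_differentiation(fetch, profiles)
print(differs)  # True
```

Running the queries simultaneously matters: prices legitimately change over time, so sequential measurements confound differentiation with ordinary price updates.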
Profiling website visitors is widely used in the advertisement industry. The goal of this project is to investigate the extent to which similar profiling is used by malware authors. That is: does browsing the internet with a more vulnerable-looking fingerprint attract more malware?
In a recent study, Jonker & Ferreira Torres looked for evidence of fingerprinting activities in Android apps. To this end, they built a crude detector. While the detector was sufficient to support them in uncovering fingerprinting-like behaviour, its crudeness caused too many false positives and (presumably) false negatives. As such, this detector could not be used to quantify how prevalent fingerprinting-like behaviour is in Android apps.
The goal of this project is to create a scanner that improves upon this.
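In its simplest form, such a scanner statically matches decompiled app code against known fingerprinting-related Android APIs. A toy sketch (the signature list below is a small illustrative sample; a real scanner would use a vetted corpus and analyse whether the calls are actually reachable):

```python
# Illustrative sample of Android identifiers associated with fingerprinting.
FINGERPRINT_APIS = (
    "Build.FINGERPRINT",
    "TelephonyManager.getDeviceId",
    "Settings.Secure.ANDROID_ID",
)

def scan_source(decompiled: str) -> dict:
    """Count occurrences of each suspicious API in decompiled app code.
    Counting call sites alone over-approximates (dead code and library
    boilerplate count too), one source of the false positives above."""
    return {api: decompiled.count(api)
            for api in FINGERPRINT_APIS if api in decompiled}
```

Improving on the crude detector would mean adding reachability or data-flow analysis, so that only identifier reads that actually flow to a network sink are reported.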
Smartphones have a wide array of sensors and antennas. The goal of this project is to use these to determine and classify the smartphone's context. This context classification can then be used to enforce privacy, e.g. by enabling context-dependent encryption of smartphone photos.
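As a first approximation, context classification maps raw sensor readings to a coarse label that a privacy policy can act on. A toy rule-based sketch (the sensor inputs, thresholds, and labels are all invented; a real system would learn them from data):

```python
def classify_context(light_lux: float, noise_db: float, moving: bool) -> str:
    """Map ambient-light, noise, and motion readings to a coarse
    context label that a privacy policy could key on."""
    if moving and noise_db > 70:
        return "commuting"
    if light_lux < 10 and noise_db < 40:
        return "home/night"
    return "public"

# A context-dependent policy could then, e.g., require photo encryption
# whenever the device is classified as being in a public context.
policy = {"public": "encrypt-photos", "commuting": "encrypt-photos",
          "home/night": "plain"}
```

The interesting research questions start where the rules end: which sensor combinations are reliable, and how to classify robustly when individual sensors disagree.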