BSc Cybercrime & IT Security · SETU · 2026

Vulnerabilities in
Modern Proctoring Software

An investigation into virtual machine detection evasion and virtual camera injection attacks in commercial online examination platforms.

Author Noel Sebi
Institution South East Technological University
Submitted April 2026

What this research examines

Online proctoring platforms are increasingly used by academic institutions to invigilate remote examinations. This project evaluates whether these platforms can reliably detect two key evasion techniques: running the examination inside a virtual machine to conceal host-side activity, and injecting a pre-recorded video feed in place of a live webcam.

Two commercial proctoring vendors — Vendor A and Vendor B — were tested across a range of configurations using both VMware Workstation 17 on a Windows 11 desktop and Oracle VirtualBox 7.2 on a Linux (Ubuntu 24.04 LTS) laptop, with Windows 10 guest VMs in all cases. Vendor A was selected based on its willingness to participate and its claims of AI-powered monitoring.

2 vendors tested · 4 attack vectors · 7 VM evasion configs · 2 hypervisors

Results across all test configurations

Vendor A
VM Detection
Industry Leading

Vendor A successfully detected virtualised environments across all seven evasion configurations tested — including full combined guest-and-host patching on both VMware and VirtualBox. Its layered approach combines SMBIOS signal checking, registry enumeration, CPUID bit inspection, and timing-based heuristics, representing a robust defence-in-depth strategy. Note: while the detection outcomes themselves were observed directly, the attribution of each outcome to a specific mechanism is inferred, as Vendor A has not disclosed its detection methods.

  • No evasion (VMware & VirtualBox) — Detected
  • Guest script only — Detected (CPUID flag)
  • Guest + host VMX patch — Detected (timing heuristic)
  • Full combined patch — Detected (behavioural heuristic)
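The layered checks inferred above can be sketched as independent signals combined with an any-match rule. Everything below is illustrative: the function names, vendor strings, and combination logic are assumptions for the sketch, not Vendor A's disclosed implementation.

```python
# Hypothetical sketch of layered guest-side VM-detection signals.
# Vendor lists and logic are illustrative, not Vendor A's actual code.

VM_SMBIOS_VENDORS = {"vmware, inc.", "innotek gmbh", "oracle corporation", "qemu"}
VM_MAC_OUIS = {"00:0C:29", "00:50:56", "08:00:27"}  # VMware / VirtualBox prefixes


def smbios_signal(manufacturer: str) -> bool:
    """Flag known hypervisor vendor strings in SMBIOS data."""
    return manufacturer.strip().lower() in VM_SMBIOS_VENDORS


def mac_oui_signal(mac: str) -> bool:
    """Flag MAC addresses whose OUI belongs to a hypervisor vendor."""
    return mac.upper()[:8] in VM_MAC_OUIS


def cpuid_signal(hypervisor_bit: bool) -> bool:
    """CPUID leaf 1, ECX bit 31 is set inside most hypervisors."""
    return hypervisor_bit


def is_virtualised(manufacturer: str, mac: str, hypervisor_bit: bool) -> bool:
    # Defence in depth: any single surviving signal is enough to flag.
    return any([
        smbios_signal(manufacturer),
        mac_oui_signal(mac),
        cpuid_signal(hypervisor_bit),
    ])
```

The point of the any-match rule is that a guest script which scrubs SMBIOS strings and the MAC OUI still trips the check if the CPUID hypervisor bit survives, which matches the "guest script only" result above.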
Vendor A
Virtual Camera
Not Detected

Despite its strong VM detection, Vendor A failed to detect virtual camera injection. A pre-recorded video loop served via OBS Studio Virtual Camera was accepted without any flag. The platform does not differentiate between physical and virtual camera devices at the DirectShow API level — a fundamental architectural limitation shared across the industry.

Vendor B
VM Detection
Failed

Vendor B failed to detect the virtual machine in every configuration tested, including the unmodified baseline, where the WMI system manufacturer string still read "VMware, Inc." and the VMware MAC address OUI 00:0C:29 was present. The session completed with a low risk score in all cases, suggesting VM detection was not part of the platform's threat model.
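To illustrate how little would have closed this baseline gap, the check can be sketched in a few lines. On Windows the manufacturer string is available via `wmic computersystem get manufacturer` or PowerShell's Get-CimInstance; here it is passed in as a parameter so the sketch stays platform-neutral, and the function name and return values are hypothetical.

```python
# Minimal baseline VM check of the kind Vendor B appears to lack.
# The vendor substrings cover the unmodified VMware/VirtualBox defaults
# observed in testing; "innotek" is VirtualBox's default manufacturer.

def baseline_vm_check(manufacturer: str) -> str:
    hypervisor_vendors = ("vmware", "virtualbox", "innotek", "qemu", "xen")
    if any(v in manufacturer.lower() for v in hypervisor_vendors):
        return "flag: virtualised environment"
    return "pass"
```

Against the unmodified baseline observed here, a one-line substring match on "VMware, Inc." would already have flagged the session.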

Vendor B
Virtual Camera
Failed

Vendor B also failed to detect virtual camera injection. In one test run, the candidate was absent from the virtual camera view for over six minutes with no real-time alert generated, suggesting monitoring operates on a delayed review basis or has significantly lower face-presence sensitivity than Vendor A.

Vendor B
Edge Browser Compatibility
Issue Found

Vendor B's session establishment failed consistently on Microsoft Edge. The session would initialise, pass the system check, and then fail at the camera permission dialogue. The issue was reproducible across three attempts on two Windows 10 configurations and resolved only when switching to Google Chrome — suggesting a compatibility issue between Vendor B's WebRTC implementation and the installed Edge version.

Comparative Results Matrix

Attack Vector | Vendor A | Vendor B | Notes
VM detection — VMware, Windows host, no evasion | Detected | Not Detected | Vendor B has no baseline VM check
VM detection — VirtualBox, Linux host, no evasion | Detected | Not Detected | Consistent across hypervisors
VM detection — VMware, guest script only | Detected | Not Detected | CPUID signal persists for Vendor A
VM detection — VMware, full combined patch | Detected | Not Detected | Timing heuristic triggers for Vendor A
Virtual camera — pre-recorded loop | Not Detected | Not Detected | Both vendors lack liveness detection
Virtual camera — OBS with live feed + overlay | Not Detected | Not Detected | Overlay injection undetected
Perspective evasion — angle tilt | Partial | Not Detected | Vendor A flags gaze deviation
Secondary device — phone at frame edge | Not Detected | Not Detected | Both vendors miss peripheral device

How testing was conducted

01

Test Environment

Two physical host machines: a Windows 11 desktop running VMware Workstation 17 (Host A), and a Linux (Ubuntu 24.04 LTS) laptop running Oracle VirtualBox 7.2 (Host B). Both hosted Windows 10 guest VMs configured with 8 GB RAM and 4 vCPUs.

02

VM Cloaking Tools

A custom PowerShell script (Invoke-BareMetalCloak.ps1) was developed to modify guest-side VM fingerprints: registry keys, WMI classes, service names, disk model strings, and MAC address OUI. A companion Bash script applied host-side VBoxManage DMI overrides and CPUID configuration.
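The host-side overrides can be illustrated by generating the VBoxManage commands such a script would run. The `VBoxInternal/Devices/pcbios/0/Config/Dmi*` extradata keys are documented VirtualBox settings; the VM name and replacement vendor strings below are made-up placeholders, not the project's actual values.

```python
# Sketch: build the VBoxManage DMI-override commands applied host-side.
# VM name and vendor strings are illustrative placeholders.

VM_NAME = "Win10-Guest"  # hypothetical VM name

DMI_OVERRIDES = {
    "DmiSystemVendor": "Dell Inc.",
    "DmiSystemProduct": "OptiPlex 7090",
    "DmiBIOSVendor": "Dell Inc.",
}


def vboxmanage_commands(vm: str, overrides: dict) -> list:
    """Render one `VBoxManage setextradata` command per DMI override."""
    base = "VBoxInternal/Devices/pcbios/0/Config/"
    return [
        f'VBoxManage setextradata "{vm}" "{base}{key}" "{value}"'
        for key, value in overrides.items()
    ]
```

These overrides rewrite what the guest's SMBIOS tables report, which is why they must be paired with the guest-side registry and WMI patching described above.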

03

Virtual Camera Injection

OBS Studio 30.x with the Virtual Camera plugin was used to substitute a pre-recorded video loop in place of the physical OBSBOT Meet 2 webcam. The OBS virtual camera's registry FriendlyName was changed to "4K Webcam" to avoid name-based detection. The virtual camera registered as a standard DirectShow device indistinguishable from hardware at the OS level.
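Why the FriendlyName rename works can be shown with a sketch of the obvious countermeasure it defeats: a blocklist of known virtual-camera device names. The blocklist entries and function name here are illustrative assumptions, not any vendor's actual check.

```python
# Sketch of a name-based virtual-camera check and why it fails.
# Blocklist entries are illustrative examples of known virtual devices.

VIRTUAL_CAMERA_NAMES = {"obs virtual camera", "obs-camera", "droidcam source"}


def is_virtual_by_name(friendly_name: str) -> bool:
    """Flag a capture device purely by its registered FriendlyName."""
    return friendly_name.strip().lower() in VIRTUAL_CAMERA_NAMES
```

Once the registry FriendlyName is changed to "4K Webcam", this check passes, and the OS itself offers no stronger distinction: at the DirectShow level the renamed virtual device enumerates exactly like hardware.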

04

Controlled Test Protocol

Each vendor's sandbox environment was used with a dummy examination account. All testing was performed in isolated environments with no real examination data or live student credentials involved at any stage.

Why these findings matter

Vendor B's VM gap is trivially exploitable

Any candidate with VirtualBox installed can run an examination in a completely unmodified VM and gain unrestricted host access throughout the session. No special tools or privileges beyond installation are required.

Virtual camera is an industry-wide architectural gap

Windows and Linux do not differentiate between physical and virtual cameras at the DirectShow/Media Foundation API level. Without hardware attestation or active liveness detection, no platform can reliably detect camera substitution using current OS primitives.
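As one illustration of the limits of software-only mitigation, a naive loop detector can hash incoming frames and look for exact periodic repetition. Everything here is hypothetical (the project did not implement this), and the approach is trivially defeated by re-encoding or per-frame noise, which is exactly why the argument above points to hardware attestation and active liveness instead.

```python
# Hypothetical loop detector: flag a feed whose frame hashes repeat
# with a fixed period. Real frame buffers would be hashed; raw bytes
# stand in here. Easily defeated by any per-frame perturbation.

import hashlib


def detect_loop(frames: list, min_period: int = 2) -> bool:
    """Return True if the frame sequence repeats with some fixed period."""
    hashes = [hashlib.sha256(f).hexdigest() for f in frames]
    n = len(hashes)
    for period in range(min_period, n // 2 + 1):
        if all(hashes[i] == hashes[i + period] for i in range(n - period)):
            return True
    return False
```

A pre-recorded loop of identical frames is caught, but any live re-encode of the same footage produces distinct hashes per frame, so exact-match detection contributes little against a determined attacker.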

Vendor A sets the benchmark

Vendor A's layered defence — combining multiple independent detection signals — means defeating any single layer is insufficient. This defence-in-depth approach represents the standard other platforms should be measured against.

Institutions should independently evaluate platforms

Marketing claims of "AI-powered monitoring" cannot be taken at face value. Independent technical evaluation — of the kind conducted in this project — is necessary before deploying proctoring platforms for high-stakes assessments.

About the project

This research was conducted as a Final Year Project for the Bachelor of Science in Cybercrime and IT Security at South East Technological University (SETU), Ireland.

All testing was performed within isolated sandbox environments using dedicated test accounts. No real examinations, live student data, or production institutional systems were accessed or compromised at any stage. Vendor names are anonymised pending responsible disclosure.

Virtual Machine Forensics · Anti-Detection Techniques · Webcam Security · PowerShell Automation · VirtualBox / VMware · OBS Studio · DirectShow API · Academic Integrity

Full Project Report

26-page technical report covering methodology, implementation, results, security implications, and conclusions.

Download Report (.docx)

Source Code

Scripts, tools, and documentation available on GitHub.

View on GitHub