I feel like it would make more sense to have some sort of universal input-transform-output system. That way, whether you want to run an app whose window straddles two screens with different pixel densities, or play a windowed game that renders on your dGPU while your iGPU handles the desktop environment, all while streaming the game and your webcam to your subscribers, archiving the stream to an H.265 file on your computer, and later live-transcoding that file to H.264 when you stream it to your cellphone, all of those functions use the same system to communicate with each other.
Sort of like if X11 and HandBrake had a baby that learned to render webpages and application layouts.
Break everything up into streams (video, audio, dynamics [such as HTML web pages or application layouts], and interrupts [such as button, keyboard, mouse, and touchscreen inputs, system calls, or timers]), transformers (video transcoders, web rendering engines, application layout systems, or desktop window managers), and input/output devices (files, network devices, displays, speakers, keyboards, and mice).
Doing things this way is in keeping with the Linux spirit of one tool per job: instead of a monolithic configuration you have a series of subcomponents that are interchangeable and swappable.
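Roughly, the contract between all of those pieces could look something like this. This is just a Python sketch to show the shape of the idea; every class and method name here is made up and isn't any existing API:

# Purely illustrative sketch: sources, transformers, and sinks all speak
# one stream format, so any of them can be swapped without touching the rest.

class Stream:
    """A chunk of data moving through the graph, tagged with its kind."""
    def __init__(self, kind, payload):
        self.kind = kind          # "video", "audio", "dynamic", or "interrupt"
        self.payload = payload

class Source:
    """An input device: a file, a network socket, a keyboard, a touchscreen."""
    def read(self):
        raise NotImplementedError

class Transformer:
    """Consumes streams of one kind and emits another: a transcoder,
    a web rendering engine, a layout system, a window manager."""
    def process(self, stream):
        raise NotImplementedError

class Sink:
    """An output device: a display, a speaker, a file, a network socket."""
    def write(self, stream):
        raise NotImplementedError

def run(source, transformers, sink):
    """Pull from the source, push through each transformer in order,
    and hand the result to the sink."""
    stream = source.read()
    for transformer in transformers:
        stream = transformer.process(stream)
    sink.write(stream)

The only point is that every source, transformer, and sink agrees on the stream format, so a transcoder, a rendering engine, or a window manager can be dropped into the same chain interchangeably.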
So a standard scenario of watching a video on a webpage could be like:
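network device -> web rendering engine -> desktop window manager -> display (the video stream)
network device -> web rendering engine -> speakers (the audio stream)
keyboard and mouse -> desktop window manager -> web rendering engine (the interrupt stream)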
While a single-purpose kiosk that solely runs an HTML interface could be set up as something like:
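web rendering engine -> minimal wrapper -> display
touchscreen -> minimal wrapper -> web rendering engine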
with all unnecessary components skipped: the rendering engine outputs to a minimalistic wrapper that converts the dynamic stream into a video stream for the monitor and receives input directly from the touchscreen. No other graphical components are required, as the system is managed through the command line.