Is that really what you’d have to go through to have a working system with plugin shaders from 3rd parties on multiple backends? Or is it mostly the result of time and trying to keep backwards compatibility with existing plugins?
Telling external devs “Write a copy in every shader language” would certainly be easier for the core team but that’s obviously undesirable.
https://devblogs.microsoft.com/directx/introducing-advanced-...
Vulkan support was introduced in OBS Studio 25.0 in March 2020, 5.5 years ago.
Metal DOES... but only on Apple hardware.
It is not officially supported on Windows; it works because the GPU vendors use the Installable Client Driver (ICD) API to bring their own driver stack. This was initially created for OpenGL, and nowadays it sits on top of the DirectX runtime.
In the embedded space, many of the OSes that support graphical output are still focused on OpenGL ES.
> Metal takes Direct3D's object-oriented approach one step further by combining it with the more "verbal" API design common in Objective-C and Swift in an attempt to provide a more intuitive and easier API for app developers to use (and not just game developers) and to further motivate those to integrate more 3D and general GPU functionality into their apps.
Slightly off-topic perhaps, but I find it amazing that an OS-level 3D graphics API can be built in such a dynamic language as Objective-C; I think it really goes to show how much optimization went into `objc_msgSend()`... it does a lot of heavy lifting in the whole OS.

In the early 2000s there was a book on using Direct3D from C# that was pretty influential in changing people's assumption that you couldn't do high-performance graphics in a GC'd language. In the end a lot of the ideas overlap with what C/C++ gamedevs do, like structuring everything around fixed-size tables allocated at load time and then minimal dynamic memory usage within the frame loop. The same concepts can apply at the graphics API level: minimize any dynamic-language overhead by dispatching work in batches that reference preallocated buffers. That gets the language runtime largely out of the way.
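The "fixed-size tables at load time, no allocation inside the frame loop" pattern described above can be sketched in any GC'd language. Here is a minimal, hypothetical Python stand-in (the names `FrameState`, `MAX_QUADS`, and the plain list standing in for a GPU vertex buffer are all illustrative, not from any real engine):

```python
# Sketch of the "preallocate at load, reuse per frame" pattern.
# A flat Python list stands in for a GPU-visible vertex buffer.
MAX_QUADS = 1024
VERTS_PER_QUAD = 4

class FrameState:
    def __init__(self):
        # One fixed-size table, allocated once at load time.
        self.vertices = [0.0] * (MAX_QUADS * VERTS_PER_QUAD * 2)  # x,y pairs
        self.quad_count = 0

    def begin_frame(self):
        # Reset the write cursor instead of allocating a new buffer.
        self.quad_count = 0

    def push_quad(self, x, y, w, h):
        if self.quad_count >= MAX_QUADS:
            return False  # table full; no dynamic growth mid-frame
        base = self.quad_count * VERTS_PER_QUAD * 2
        corners = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]
        for i, (cx, cy) in enumerate(corners):
            self.vertices[base + 2 * i] = cx
            self.vertices[base + 2 * i + 1] = cy
        self.quad_count += 1
        return True
```

The frame loop only writes into the preallocated table and dispatches it in one batch, so the language runtime (GC, dynamic dispatch) sees almost no per-frame work.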
What you are talking about are C++ wrappers around the Metal Objective-C API. Yes, it is weird, as they are going C++ -> Objective-C -> C++. Why not go directly? Because Apple does not ship C++ system frameworks.
The term is Objective-C++.
Not sure why though, because Metal 3 is still supported on a bunch of Intel Macs...
For AAA titles with newer graphics, well, you can always send the screen of the PC with the Nvidia card through a capture card.
Back in my days of streaming, ca. 2017, macOS was not an option. Today I'd do it with any M-series Mac without a second thought.
Worked reasonably well (you can send camera/VTuber output and captured video from game and any overlays separately, or just use the setup in a similar way to a capture card and run ONLY the game on the gaming PC and everything else on the Mac), but added some complexity to it all.
A beefy Nvidia GPU would make that setup not necessary, unless you want to directly play games on the Mac.
Occasionally, I will show 3 things at once: an MP4 that the Mac Mini plays from its storage, transitioning into a captured HDMI signal from a Canon camera as picture-in-picture, with the main body of the stream containing the captured HDMI output from my development laptop.
I'm not sure what my capture solution will be, but it seems there are a wide variety of USB-C capture adapters that I could use that are compatible with OBS on Mac and are even bus powered.
Other comments seem to indicate there are bugs in that specific picture-in-picture setup, but I'm sure those will get ironed out.
I hope the next version actually works in some capacity.
Turning off nearly everything iCloud- or Spotlight-related is a pretty good start; disable network access and you may find even more pearls of wisdom.
- recording your screen but not streaming
- not customizing what goes into your recording
Then use something else. GPU screen recorder has a lower overhead and produces much smoother recordings: https://git.dec05eba.com/gpu-screen-recorder/about/
OBS is great, it was my go to recording tool. But my videos were choppy until I started using GPU screen recorder.
> If you are running another distro then you can run sudo ./install.sh, but you need to manually install the dependencies, as described below.
And then I just skimmed the rest, because I assumed it would be about manual dependency installation which I am not interested in.
Odd that the easiest installation method listed on line 3 was not the first line in the installation text, that's not a great DX.
Also - I'm on macOS, and the OBS blog update I shared was for macOS.
A famously missing macOS feature. Loopback is yonder: https://rogueamoeba.com/loopback/
> the shortcut to stop screen recording on QuickTime sucks, it’s like CMD+CTRL+ESC
I just stop it from the menu bar, then on the resultant video press Cmd-T (trim) to lop off that footage.
Edit: I think you might have skipped reading the post. It's about OBS on macOS, where QuickTime exists. Your suggestion seems geared toward Linux.
Lowering the framerate doesn't really decrease video size, because of how video encoding works, but you can lower the bitrate for the recorded video, trading a bit of quality for a smaller file.
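The rough size model behind that claim: for a bitrate-targeted encode, file size is approximately bitrate × duration, independent of framerate. A quick back-of-the-envelope helper (the function name and units here are my own, just for illustration):

```python
def recording_size_mb(bitrate_kbps, duration_s):
    """Approximate file size in MB for a constant-bitrate recording.

    size ≈ bitrate × duration; framerate doesn't appear anywhere,
    which is why dropping from 60 to 30 fps barely shrinks the file.
    """
    bits = bitrate_kbps * 1000 * duration_s
    return bits / 8 / 1_000_000  # bits -> bytes -> megabytes

# A 10-minute recording at 6000 kbps is ~450 MB at 30 OR 60 fps;
# halving the bitrate to 3000 kbps halves the size instead.
```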
“OBS Studio Gets A New Renderer: How OBS Adopted Metal”
And I call it great music.
But they’ve clearly learned a lot that will help in the future with other modern APIs like DX12 or Vulkan.
That's beside the point though; the OS has been trash for realtime encoding for over a decade now. At the very least you have to write a script that repeatedly renices the process back to the top when it tries to protect you from the excessive thermal load lmao
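A minimal sketch of such a renice watchdog, assuming the encoder process is named `obs` and the standard `pgrep`/`renice` tools are on the PATH (the process name and niceness value are guesses, and negative niceness typically requires elevated privileges):

```python
# Hypothetical watchdog: bump a process back to high priority whenever
# the OS demotes it. "obs" and -10 are illustrative defaults.
import subprocess

def bump_priority(proc_name="obs", niceness=-10):
    """Renice every process matching proc_name; return the PIDs touched."""
    result = subprocess.run(["pgrep", "-x", proc_name],
                            capture_output=True, text=True)
    pids = result.stdout.split()
    for pid in pids:
        # May need sudo for negative niceness values.
        subprocess.run(["renice", "-n", str(niceness), "-p", pid])
    return pids

# Run periodically, e.g.:
#   while True: bump_priority(); time.sleep(10)
```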