Overview
Request 3801 (accepted)
- Update to version 18.0.2:
* UI/updater: Fix temp files being created and not deleted
* UI/updater: Fix potential fail case when no files to patch
* UI/updater: Fixed a bug with deflating
* UI/updater: Ignore 64bit files on 32bit windows
* CI: Use ccache to speed up the build
* CI: OSX - Fix obs.png
* UI/updater: Fix incorrect inflate use
* CI: Linux - Install libfdk-aac-dev
* image-source: Move file modification check before animation processing
* UI: Add workaround to fix deleting final scene bug
* rtmp-services: Update ingest list for Restream.io
* rtmp-services: Update maximum bitrate for Twitch
* UI: Fix segfault when no system tray exists
* CI: Linux - Install FFmpeg from source
* obs-ffmpeg/nvenc: Remove "default" preset
* libobs: Add obs_source_copy_filters function
* UI: Add copying/pasting of sources/filters
* UI: Disable filter pasting when scene collection changed
* UI: Fix bug where items can't be deleted in last scene
* libobs: Remove unimplemented exports
* rtmp-services: Add Livestream service
* win-dshow: Fix issue with activating when not set to
* rtmp-services: Update Picarto maximum bitrates
* libobs: Delay stop detection of audio source
* libobs: Allow source to fully control source flags (for now)
* libobs: Add ability to preload async frames
* libobs: Remove multiple calls to free_type_data
* deps: Add media-playback static lib
* obs-ffmpeg: Change from libff to media-playback
- Created by boombatower about 8 years ago
- In state accepted
Package maintainers: boombatower, darix, and frispete
obs-studio.changes
Changed
-------------------------------------------------------------------
+Tue May 02 19:05:40 UTC 2017 - jimmy@boombatower.com
+
+- Update to version 18.0.2:
+ * UI/updater: Fix temp files being created and not deleted
+ * UI/updater: Fix potential fail case when no files to patch
+ * UI/updater: Fixed a bug with deflating
+ * UI/updater: Ignore 64bit files on 32bit windows
+ * CI: Use ccache to speed up the build
+ * CI: OSX - Fix obs.png
+ * UI/updater: Fix incorrect inflate use
+ * CI: Linux - Install libfdk-aac-dev
+ * image-source: Move file modification check before animation processing
+ * UI: Add workaround to fix deleting final scene bug
+ * rtmp-services: Update ingest list for Restream.io
+ * rtmp-services: Update maximum bitrate for Twitch
+ * UI: Fix segfault when no system tray exists
+ * CI: Linux - Install FFmpeg from source
+ * obs-ffmpeg/nvenc: Remove "default" preset
+ * libobs: Add obs_source_copy_filters function
+ * UI: Add copying/pasting of sources/filters
+ * UI: Disable filter pasting when scene collection changed
+ * UI: Fix bug where items can't be deleted in last scene
+ * libobs: Remove unimplemented exports
+ * rtmp-services: Add Livestream service
+ * win-dshow: Fix issue with activating when not set to
+ * rtmp-services: Update Picarto maximum bitrates
+ * libobs: Delay stop detection of audio source
+ * libobs: Allow source to fully control source flags (for now)
+ * libobs: Add ability to preload async frames
+ * libobs: Remove multiple calls to free_type_data
+ * deps: Add media-playback static lib
+ * obs-ffmpeg: Change from libff to media-playback
+ * deps/libff: Remove network init
+ * UI: Remove libff as a dependency
+ * deps/libff: Don't build libff (deprecated)
+ * obs-ffmpeg: Remove unnecessary open call
+ * obs-ffmpeg: Always open on update unless set otherwise
+ * obs-ffmpeg: Fix bug on non-MSVC compilers
+ * UI: Fix property widgets not being disabled
+ * mac-avcapture: Ability to directly add iOS devices over USB
+ * audio-monitoring: Add ability to monitor Outputs
+ * decklink: Add option to select channel format
+ * decklink: Add workaround for audio timestamp jump issue
+ * Improve README/CONTRIBUTING files
+ * win-dshow: Fix reallocation issue in ffmpeg-decode
+ * UI: Add window name to remux dialog
+ * UI: Hide OpenGL and D3D adapter on Windows
+ * UI: Continue to show OpenGL if already in use
+ * UI: Increase MAX_CRASH_REPORT_SIZE to 150 KB
+ * CI: Use webhooks for notifications
+ * CI: Fix notification frequency
+ * libobs-opengl: Log OpenGL version on all systems
+ * Fix various typos across multiple modules
+ * Update Linux kernel coding style URL in CONTRIBUTING
+ * UI: Ctrl+E to Edit Transform
+ * UI: Remove unused defines from old updater code
+ * win-capture: Log if shared texture capture is unavailable
+ * win-capture: Update get-graphics-offsets
+ * win-capture: Add missing 32 bit offsets
+ * win-capture: Fix and clarify window capture prioritization
+ * UI: Add front-end API functions to get/modify service
+ * UI: Display filename when dragging & dropping
+ * obs-outputs: Always call RTMP_Init before connecting
+ * UI: Make sure all dialogs have close buttons
+ * UI: Add command line option for starting up always on top
+ * frontend-tools: Rename some files
+ * frontend-plugins: Abstract captions
+ * enc-amf: Update to v2.1.0(.0)
+ * win-ivcam: Fix potential null pointer dereference
+ * libobs: Update to 18.0.2 (windows hotfix)
+ * UI/updater: Add opt to disable building update module
+
+-------------------------------------------------------------------
Tue Mar 07 05:13:13 UTC 2017 - jimmy@boombatower.com
- Update to version 18.0.1:
obs-studio.spec
Changed
Name: obs-studio
-Version: 18.0.1
+Version: 18.0.2
Release: 0
Summary: A recording/broadcasting program
_service
Changed
<services>
<service name="tar_scm" mode="disabled">
<param name="versionformat">@PARENT_TAG@</param>
- <param name="revision">refs/tags/18.0.1</param>
+ <param name="revision">refs/tags/18.0.2</param>
<param name="url">git://github.com/jp9000/obs-studio.git</param>
<param name="scm">git</param>
<param name="changesgenerate">enable</param>
_servicedata
Changed
<servicedata>
<service name="tar_scm">
<param name="url">git://github.com/jp9000/obs-studio.git</param>
- <param name="changesrevision">bc9a58174b75c445dcb14259443a74746f0b3d43</param>
+ <param name="changesrevision">4684294bcd53193f4227fb31214a6417fdcd2b97</param>
</service>
</servicedata>
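For context on the two XML fragments above: `tar_scm` is the openSUSE source service that checks out the git revision named in `revision` and packs it into the source tarball; `changesgenerate` set to `enable` makes it generate `.changes` entries from the git log; and the `changesrevision` in `_servicedata` records the last commit the service saw, so the next run only picks up newer commits. Reassembled, the updated `_service` file would look roughly like this (the closing `</services>` tag is assumed, since the diff excerpt cuts off before it):

```xml
<services>
  <service name="tar_scm" mode="disabled">
    <param name="versionformat">@PARENT_TAG@</param>
    <param name="revision">refs/tags/18.0.2</param>
    <param name="url">git://github.com/jp9000/obs-studio.git</param>
    <param name="scm">git</param>
    <param name="changesgenerate">enable</param>
  </service>
</services>
```

Because the service is `mode="disabled"`, it does not run on every commit; it is triggered manually (e.g. with `osc service disabledrun`) when the packager bumps the tag.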
obs-studio-18.0.1.tar.xz/CONTRIBUTING
Deleted
-Contributing Information:
-
- - Our bug tracker is located at (currently linked to forum accounts):
- https://obsproject.com/mantis/
-
- - Development forums are currently located at:
- https://obsproject.com/forum/list/general-development.21/
-
- - Development IRC channel is primarily #obs-dev on QuakeNet
-
- - To contribute translations, see:
- https://obsproject.com/forum/threads/how-to-contribute-translations-for-obs.16327/
-
-
-Contributing Guidelines:
-
- - Code and commits will be reviewed. Reviews will be blunt and honest, and
- your change probably won't go through the first time, or may be outright
- rejected. It may require many revisions before changes are finally
- accepted.
-
- - If you want to avoid doing a lot of work only to have it rejected, discuss
-   what you want to change in the main channels/forums/mailing lists before
- actually working on it. Open source requires thick skin. Please don't be
- discouraged if your changes don't go through, learn from it and get better.
-
- - Coding style is Linux-style KNF (kernel normal form). This means K&R, 80
- columns max, preferable maximum function length of approximately 42 lines, 8
- character width tabs, lower_case_names, etc. I chose this for the sake of
- the project. Don't argue about it, just do it. It takes no time to do.
-
- See https://www.kernel.org/doc/Documentation/CodingStyle for a general
- guideline (though not necessarily a rulebook, for example we allow the use
- of boolean return values instead of ints for failure).
-
- NOTE: C++ is an exception to the lower_case_only rule, CamelCase is
- encouraged (though not required) to distinguish it from C code. Just a
- personal and subjective stylistic thing on my part.
-
- - Commits are not just changes to code; they should also be treated as
- annotation to code. For that reason, do not put unrelated changes in a
- single commit. Separate out different changes in to different commits, and
- make separate pull requests for unrelated changes. Commits should be
- formatted with the 50/72 standard, and should be descriptive and concise.
- See http://chris.beams.io/posts/git-commit/ for a summary of how to make
- good commit messages.
-
- - Core code is C only (unless there's an operating system specific thing that
- absolutely requires another language).
-
- - Modules and UI may use C, C++, or Objective-C (for apple), though please try
- to use C unless an API you're using requires C++ or Objective-C (such as
- windows COM classes, or apple Objective-C APIs).
-
- - If you don't quite know what to work on and just want to help, the bug
- tracker has a list of things that need to be worked on.
-
- - Try to respect the wishes of the author(s)/maintainer(s). A good maintainer
- will always listen, and will often ask others on the project for their
- opinions, but don't expect things to be completely democratic.
-
- - Do not use dependencies for the sake of convenience. There's enough
- dependencies as it is. Use them only if you absolutely have to depend on
- them.
obs-studio-18.0.1.tar.xz/README
Deleted
-
-What is OBS?
-
- This project is a rewrite of what was formerly known as "Open Broadcaster
- Software", software originally designed for recording and streaming live
- video content, efficiently.
-
-Bug Tracker: https://obsproject.com/mantis/
-
- We are no longer using GitHub issues! Please use Mantis, and only report
- bugs and major issues. Do NOT use mantis to ask questions or request
- features, please keep that to the forums.
-
- Forum accounts are now linked to Mantis Bug Tracker. To use the bug
- tracker, simply log in to the forums and then go to the bug tracker link
- above.
-
-What's the goal of rewriting OBS?
-
- - Make it multiplatform. Use multiplatform libraries/functions/classes where
- possible to allow this. Multi-platform support was one of the primary
- reasons for the rewrite. This also means using a UI toolkit will be
- necessary for user interface. It also means allowing the use of OpenGL as
- well as Direct3D.
-
- - Separate the application from the core, allowing custom application of
- the core if desired, and easier extending of the user interface.
-
- - Simplify complex systems to not only make it easier to use, but easier to
- maintain.
-
- - Write a better core API, and design the entire system to be modular.
-
- - Now that we have much more experience, improve the overall design of all
- the subsystems/API, and minimize/eliminate design flaws. Make it so we can
- do all the things we've had trouble with before, such as custom outputs,
- multiple outputs at once, better handling of inputs, custom services.
-
- - Make a better/cleaner code base, use better coding standards, use standard
- libraries where possible (not just STL and C standard library, but also
- things like ffmpeg as well), and improve maintainability of the project as a
- whole.
-
- - Implement a new API-independent shader/effect system allowing better and
- easier shaders usage and customization without having to duplicate shader
- code.
-
- - Better device support. Again, I didn't know what I was getting into when
- I originally started writing code for devices. It evolved into a totally
- convoluted mess. I would have improved the existing device plugin code, but
- it was just all so fundamentally bad and flawed that it would have been
- detrimental to progression to continue working on it rather than rewrite it.
-
-
-What was wrong with the original OBS?
-
- The original OBS was rewritten not because it was bad, at least in terms of
- optimization. Optimization and graphics are things I love. However, there
- were some serious problems with the code and design that were deep and
- fundamental, which prevented myself and other developers from being able to
- improve/extend the application or add new features very easily.
-
- First, the design flaws:
-
- - The original OBS was completely and hopelessly hard-coded for windows,
- and only windows. It was just totally impossible to use it on other
- systems.
-
- - All the sub-systems were written before I really knew what I was getting
- into. When I started the project, I didn't really fully comprehend the
- scope of what I would need or how to properly design the project. My
- design and plans for the application were just to write something that
- would "stream games and a webcam, with things like overlays and such."
- This turned out fine for most casual gamers and streamers (and very
-     successful), but left anyone wanting to do anything more advanced
-     massively wanting.
-
- - Subsystems and core functionalities intermingled in such a way that it
- was a nightmare to get proper custom functionality out of it. Things
- like QSV had to be meshed in with the main encoding loop, and it just
- made things a nightmare to deal with. Custom outputs were nigh
- impossible.
-
- - The API was poorly designed because most of it came after I originally
- wrote the application, it was more of an afterthought, and plugin API
- would routinely break for plugin developers due to changing C++
- interfaces (one of the reasons the core is now C).
-
- - API was intermeshed with the main executable. The OBSApi DLL was
- nothing more than basically this mutant growth upon OBS.exe that allowed
- plugin developers to barely write plugins, but all the important API
- code was actually stored in the executable. Navigation was a total mess.
-
- - The graphics subsystem, while not bad, was incomplete, and though far
- easier to use than bare D3D, wasn't ideal, and was hard-coded for D3D
- specifically.
-
- - The devices and audio code was poor, I had no idea what I was getting into
- when I started writing them in. I did not realize beforehand all the
- device-specific quirks that each device/system could have. Some devices
- had bad timing and quirks that I never anticipated while writing them.
- I struggled with devices, and my original design for the audio subsystem
-     for example morphed over and over into an abomination that, though it
-     works, is basically this giant duct-taped zombie monster.
-
- - Shaders were difficult to customize because they had to be duplicated if
- you wanted slightly different functionality that required more than just
- changing shader constants.
-
- - Orientation of sources was fixed, and required special code for each
- source to do any custom modification of rotation/position/scale/etc.
- This is one of those fundamental flaws that I look back on and regret, as
- it was a stupid idea from the beginning. I originally thought I could
- get more accurate source position/sizes, but it just turned out to be
- totally bad. Should have been matrices from the beginning just like with
- a regular 3D engine.
-
- Second, the coding flaws:
-
- - The coding style was inconsistent.
-
- - C++98, C-Style C++, there was no exception usage, no STL. C++ used
- poorly.
-
- - Not Invented Here Syndrome everywhere. Custom string functions/classes,
- custom templates, custom everything everywhere. To be fair, it was all
- hand-me-down code from the early 2000s that I had become used to, but
- that was no excuse -- C-standard libraries and the STL should have been
- used from the beginning over anything else. That doesn't mean to say
- that using custom stuff is always bad, but doing it to the extent I did
- definitely was. Made it horrible to maintain as well, required extra
- knowledge for plugin developers and anyone messing with the code.
-
-  - Giant monolithic classes everywhere, the main OBS class was particularly
- bad in this regard. This meant navigation was a nightmare, and no one
- really knew where to go or where to add/change things.
-
- - Giant monolithic functions everywhere. This was particularly bad
- because it meant that functions became harder to debug and harder to
- keep track of what was going on in any particular function at any given
- time. These large functions, though not inefficient, were delicate and
- easily breakable. (See OBS::MainCaptureLoop for a nightmarish example,
- or the listbox subclass window procedure in WindowStuff.cpp)
-
- - Very large file sizes with everything clumped up into single files (for
- another particularly nightmarish example, see WindowStuff.cpp)
-
- - Bad formatting. Code could go beyond 200 columns in some cases, making
- it very unpleasant to read with many editors. Spaces instead of tabs,
- K&R mixed with allman (which was admittedly my fault).
-
-
-New (actual) coding guidelines
-
- - For the C code (especially in the core), guidelines are pretty strict K&R,
- kernel style. See the linux kernel "CodingStyle" document for more
- information. That particular coding style guideline is for more than just
- style, it actually helps produce a better overall code base.
-
- - For C++ code, I still use CamelCase instead of all_lowercase just because
- I prefer it that way, it feels right with C++ for some reason. It also
- helps make it distinguishable from C code.
-
- - I've started using 8-column tabs for almost everything -- I really
- personally like it over 4-column tabs. I feel that 8-column tabs are very
- helpful in preventing large amounts of indentation. A self-imposed
- limitation, if you will. I also use actual tabs now, instead of spaces.
- Also, I feel that the K&R style looks much better/cleaner when viewed with
- 8-column tabs.
-
- - Preferred maximum columns: 80. I've also been doing this because in
- combination with 8-column tabs, it further prevents large/bad functions
- with high indentation. Another self-imposed limitation. Also, it makes
- for much cleaner viewing in certain editors that wrap (like vim).
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/captions-stream.cpp
Deleted
-#include "captions-stream.hpp"
-#include <mmreg.h>
-#include <util/windows/CoTaskMemPtr.hpp>
-#include <util/threading.h>
-#include <util/base.h>
-
-using namespace std;
-
-#if 0
-#define debugfunc(format, ...) blog(LOG_DEBUG, "[Captions] %s(" format ")", \
- __FUNCTION__, ##__VA_ARGS__)
-#else
-#define debugfunc(format, ...)
-#endif
-
-CaptionStream::CaptionStream(DWORD samplerate_) :
- samplerate(samplerate_),
- event(CreateEvent(nullptr, false, false, nullptr))
-{
- buf_info.ulMsMinNotification = 50;
- buf_info.ulMsBufferSize = 500;
- buf_info.ulMsEventBias = 0;
-
- format.wFormatTag = WAVE_FORMAT_PCM;
- format.nChannels = 1;
- format.nSamplesPerSec = 16000;
- format.nAvgBytesPerSec = format.nSamplesPerSec * sizeof(uint16_t);
- format.nBlockAlign = 2;
- format.wBitsPerSample = 16;
- format.cbSize = sizeof(format);
-
- resampler.Reset(&format);
-}
-
-void CaptionStream::Stop()
-{
- {
- lock_guard<mutex> lock(m);
- circlebuf_free(buf);
- }
-
- cv.notify_one();
-}
-
-void CaptionStream::PushAudio(const struct audio_data *data, bool muted)
-{
- uint8_t *output[MAX_AV_PLANES] = {};
- uint32_t frames = data->frames;
- uint64_t ts_offset;
- bool ready = false;
-
- audio_resampler_resample(resampler, output, &frames, &ts_offset,
- data->data, data->frames);
-
- if (output[0]) {
- if (muted)
- memset(output[0], 0, frames * sizeof(int16_t));
-
- lock_guard<mutex> lock(m);
- circlebuf_push_back(buf, output[0], frames * sizeof(int16_t));
- write_pos += frames * sizeof(int16_t);
-
- if (wait_size && buf->size >= wait_size)
- ready = true;
- }
-
- if (ready)
- cv.notify_one();
-}
-
-// IUnknown methods
-
-STDMETHODIMP CaptionStream::QueryInterface(REFIID riid, void **ppv)
-{
- if (riid == IID_IUnknown) {
- AddRef();
- *ppv = this;
-
- } else if (riid == IID_IStream) {
- AddRef();
- *ppv = (IStream*)this;
-
- } else if (riid == IID_ISpStreamFormat) {
- AddRef();
- *ppv = (ISpStreamFormat*)this;
-
- } else if (riid == IID_ISpAudio) {
- AddRef();
- *ppv = (ISpAudio*)this;
-
- } else {
- *ppv = nullptr;
- return E_NOINTERFACE;
- }
-
- return NOERROR;
-}
-
-STDMETHODIMP_(ULONG) CaptionStream::AddRef()
-{
- return (ULONG)os_atomic_inc_long(&refs);
-}
-
-STDMETHODIMP_(ULONG) CaptionStream::Release()
-{
- ULONG new_refs = (ULONG)os_atomic_dec_long(&refs);
- if (!new_refs)
- delete this;
-
- return new_refs;
-}
-
-// ISequentialStream methods
-
-STDMETHODIMP CaptionStream::Read(void *data, ULONG bytes, ULONG *read_bytes)
-{
- HRESULT hr = S_OK;
- size_t cur_size;
-
- debugfunc("data, %lu, read_bytes", bytes);
- if (!data)
- return STG_E_INVALIDPOINTER;
-
- {
- lock_guard<mutex> lock1(m);
- wait_size = bytes;
- cur_size = buf->size;
- }
-
- unique_lock<mutex> lock(m);
-
- if (bytes > cur_size)
- cv.wait(lock);
-
- if (bytes > (ULONG)buf->size) {
- bytes = (ULONG)buf->size;
- hr = S_FALSE;
- }
- if (bytes)
- circlebuf_pop_front(buf, data, bytes);
- if (read_bytes)
- *read_bytes = bytes;
-
- wait_size = 0;
- pos.QuadPart += bytes;
- return hr;
-}
-
-STDMETHODIMP CaptionStream::Write(const void *, ULONG bytes,
- ULONG*)
-{
- debugfunc("data, %lu, written_bytes", bytes);
- UNUSED_PARAMETER(bytes);
-
- return STG_E_INVALIDFUNCTION;
-}
-
-// IStream methods
-
-STDMETHODIMP CaptionStream::Seek(LARGE_INTEGER move, DWORD origin,
- ULARGE_INTEGER *new_pos)
-{
- debugfunc("%lld, %lx, new_pos", move, origin);
- UNUSED_PARAMETER(move);
- UNUSED_PARAMETER(origin);
-
- if (!new_pos)
- return E_POINTER;
-
- if (origin != SEEK_CUR || move.QuadPart != 0)
- return E_NOTIMPL;
-
- *new_pos = pos;
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::SetSize(ULARGE_INTEGER new_size)
-{
- debugfunc("%llu", new_size);
- UNUSED_PARAMETER(new_size);
- return STG_E_INVALIDFUNCTION;
-}
-
-STDMETHODIMP CaptionStream::CopyTo(IStream *stream, ULARGE_INTEGER bytes,
- ULARGE_INTEGER *read_bytes,
- ULARGE_INTEGER *written_bytes)
-{
- HRESULT hr;
-
- debugfunc("stream, %llu, read_bytes, written_bytes", bytes);
-
- if (!stream)
- return STG_E_INVALIDPOINTER;
-
- ULONG written = 0;
- if (bytes.QuadPart > (ULONGLONG)buf->size)
- bytes.QuadPart = (ULONGLONG)buf->size;
-
- lock_guard<mutex> lock(m);
- temp_buf.resize((size_t)bytes.QuadPart);
- circlebuf_peek_front(buf, &temp_buf[0], (size_t)bytes.QuadPart);
-
- hr = stream->Write(temp_buf.data(), (ULONG)bytes.QuadPart, &written);
-
- if (read_bytes)
- *read_bytes = bytes;
- if (written_bytes)
- written_bytes->QuadPart = written;
-
- return hr;
-}
-
-STDMETHODIMP CaptionStream::Commit(DWORD commit_flags)
-{
- debugfunc("%lx", commit_flags);
- UNUSED_PARAMETER(commit_flags);
- /* TODO? */
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::Revert(void)
-{
- debugfunc("");
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::LockRegion(ULARGE_INTEGER offset,
- ULARGE_INTEGER size, DWORD type)
-{
- debugfunc("%llu, %llu, %ld", offset, size, type);
- UNUSED_PARAMETER(offset);
- UNUSED_PARAMETER(size);
- UNUSED_PARAMETER(type);
- /* TODO? */
- return STG_E_INVALIDFUNCTION;
-}
-
-STDMETHODIMP CaptionStream::UnlockRegion(ULARGE_INTEGER offset,
- ULARGE_INTEGER size, DWORD type)
-{
- debugfunc("%llu, %llu, %ld", offset, size, type);
- UNUSED_PARAMETER(offset);
- UNUSED_PARAMETER(size);
- UNUSED_PARAMETER(type);
- /* TODO? */
- return STG_E_INVALIDFUNCTION;
-}
-
-static const wchar_t *stat_name = L"Caption stream";
-
-STDMETHODIMP CaptionStream::Stat(STATSTG *stg, DWORD flag)
-{
- debugfunc("stg, %lu", flag);
-
- if (!stg)
- return E_POINTER;
-
- lock_guard<mutex> lock(m);
- *stg = {};
- stg->type = STGTY_STREAM;
- stg->cbSize.QuadPart = (ULONGLONG)buf->size;
-
- if (flag == STATFLAG_DEFAULT) {
- stg->pwcsName = (wchar_t*)CoTaskMemAlloc(sizeof(stat_name));
- memcpy(stg->pwcsName, stat_name, sizeof(stat_name));
- }
-
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::Clone(IStream **stream)
-{
- debugfunc("stream");
- *stream = nullptr;
- return E_NOTIMPL;
-}
-
-// ISpStreamFormat methods
-
-STDMETHODIMP CaptionStream::GetFormat(GUID *guid,
- WAVEFORMATEX **co_mem_wfex_out)
-{
- debugfunc("guid, co_mem_wfex_out");
-
- if (!guid || !co_mem_wfex_out)
- return E_POINTER;
-
- if (format.wFormatTag == 0) {
- *co_mem_wfex_out = nullptr;
- return S_OK;
- }
-
- void *wfex = CoTaskMemAlloc(sizeof(format));
- memcpy(wfex, &format, sizeof(format));
-
- *co_mem_wfex_out = (WAVEFORMATEX*)wfex;
- return S_OK;
-}
-
-// ISpAudio methods
-
-STDMETHODIMP CaptionStream::SetState(SPAUDIOSTATE state_, ULONGLONG)
-{
- debugfunc("%lu, reserved", state_);
- state = state_;
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::SetFormat(REFGUID guid_ref,
- const WAVEFORMATEX *wfex)
-{
- debugfunc("guid, wfex");
- if (!wfex)
- return E_INVALIDARG;
-
- if (guid_ref == SPDFID_WaveFormatEx) {
- lock_guard<mutex> lock(m);
- memcpy(&format, wfex, sizeof(format));
- resampler.Reset(wfex);
-
- /* 50 msec */
- DWORD size = format.nSamplesPerSec / 20;
- DWORD byte_size = size * format.nBlockAlign;
- circlebuf_reserve(buf, (size_t)byte_size);
- }
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::GetStatus(SPAUDIOSTATUS *status)
-{
- debugfunc("status");
-
- if (!status)
- return E_POINTER;
-
- /* TODO? */
- lock_guard<mutex> lock(m);
- *status = {};
- status->cbNonBlockingIO = (ULONG)buf->size;
- status->State = state;
- status->CurSeekPos = pos.QuadPart;
- status->CurDevicePos = write_pos;
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::SetBufferInfo(const SPAUDIOBUFFERINFO *buf_info_)
-{
- debugfunc("buf_info");
-
- /* TODO */
- buf_info = *buf_info_;
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::GetBufferInfo(SPAUDIOBUFFERINFO *buf_info_)
-{
- debugfunc("buf_info");
- if (!buf_info_)
- return E_POINTER;
-
- *buf_info_ = buf_info;
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::GetDefaultFormat(GUID *format,
- WAVEFORMATEX **co_mem_wfex_out)
-{
- debugfunc("format, co_mem_wfex_out");
-
- if (!format || !co_mem_wfex_out)
- return E_POINTER;
-
- void *wfex = CoTaskMemAlloc(sizeof(format));
- memcpy(wfex, &format, sizeof(format));
-
- *format = SPDFID_WaveFormatEx;
- *co_mem_wfex_out = (WAVEFORMATEX*)wfex;
- return S_OK;
-}
-
-STDMETHODIMP_(HANDLE) CaptionStream::EventHandle(void)
-{
- debugfunc("");
- return event;
-}
-
-STDMETHODIMP CaptionStream::GetVolumeLevel(ULONG *level)
-{
- debugfunc("level");
- if (!level)
- return E_POINTER;
-
- *level = vol;
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::SetVolumeLevel(ULONG level)
-{
- debugfunc("%lu", level);
- vol = level;
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::GetBufferNotifySize(ULONG *size)
-{
- debugfunc("size");
- if (!size)
- return E_POINTER;
- *size = notify_size;
- return S_OK;
-}
-
-STDMETHODIMP CaptionStream::SetBufferNotifySize(ULONG size)
-{
- debugfunc("%lu", size);
- notify_size = size;
- return S_OK;
-}
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/captions-stream.hpp
Deleted
-#include <windows.h>
-#include <sapi.h>
-#include <condition_variable>
-#include <mutex>
-#include <vector>
-#include <obs.h>
-#include <media-io/audio-resampler.h>
-#include <util/circlebuf.h>
-#include <util/windows/WinHandle.hpp>
-
-#include <fstream>
-
-class CircleBuf {
- circlebuf buf = {};
-public:
- inline ~CircleBuf() {circlebuf_free(&buf);}
- inline operator circlebuf*() {return &buf;}
- inline circlebuf *operator->() {return &buf;}
-};
-
-class Resampler {
- audio_resampler_t *resampler = nullptr;
-
-public:
- inline void Reset(const WAVEFORMATEX *wfex)
- {
- const struct audio_output_info *aoi =
- audio_output_get_info(obs_get_audio());
-
- struct resample_info src;
- src.samples_per_sec = aoi->samples_per_sec;
- src.format = aoi->format;
- src.speakers = aoi->speakers;
-
- struct resample_info dst;
- dst.samples_per_sec = uint32_t(wfex->nSamplesPerSec);
- dst.format = AUDIO_FORMAT_16BIT;
- dst.speakers = (enum speaker_layout)wfex->nChannels;
-
- if (resampler)
- audio_resampler_destroy(resampler);
- resampler = audio_resampler_create(&dst, &src);
- }
-
- inline ~Resampler() {audio_resampler_destroy(resampler);}
- inline operator audio_resampler_t*() {return resampler;}
-};
-
-class CaptionStream : public ISpAudio {
- volatile long refs = 1;
- SPAUDIOBUFFERINFO buf_info = {};
- ULONG notify_size = 0;
- SPAUDIOSTATE state;
- WinHandle event;
- ULONG vol = 0;
-
- std::condition_variable cv;
- std::mutex m;
- std::vector<int16_t> temp_buf;
- WAVEFORMATEX format = {};
- Resampler resampler;
-
- CircleBuf buf;
- ULONG wait_size = 0;
- DWORD samplerate = 0;
- ULARGE_INTEGER pos = {};
- ULONGLONG write_pos = 0;
-
-public:
- CaptionStream(DWORD samplerate);
-
- void Stop();
- void PushAudio(const struct audio_data *audio_data, bool muted);
-
- // IUnknown methods
- STDMETHODIMP QueryInterface(REFIID riid, void **ppv) override;
- STDMETHODIMP_(ULONG) AddRef() override;
- STDMETHODIMP_(ULONG) Release() override;
-
- // ISequentialStream methods
- STDMETHODIMP Read(void *data, ULONG bytes, ULONG *read_bytes) override;
- STDMETHODIMP Write(const void *data, ULONG bytes, ULONG *written_bytes)
- override;
-
- // IStream methods
- STDMETHODIMP Seek(LARGE_INTEGER move, DWORD origin,
- ULARGE_INTEGER *new_pos) override;
- STDMETHODIMP SetSize(ULARGE_INTEGER new_size) override;
- STDMETHODIMP CopyTo(IStream *stream, ULARGE_INTEGER bytes,
- ULARGE_INTEGER *read_bytes,
- ULARGE_INTEGER *written_bytes) override;
- STDMETHODIMP Commit(DWORD commit_flags) override;
- STDMETHODIMP Revert(void) override;
- STDMETHODIMP LockRegion(ULARGE_INTEGER offset, ULARGE_INTEGER size,
- DWORD type) override;
- STDMETHODIMP UnlockRegion(ULARGE_INTEGER offset, ULARGE_INTEGER size,
- DWORD type) override;
- STDMETHODIMP Stat(STATSTG *stg, DWORD flags) override;
- STDMETHODIMP Clone(IStream **stream) override;
-
- // ISpStreamFormat methods
- STDMETHODIMP GetFormat(GUID *guid, WAVEFORMATEX **co_mem_wfex_out)
- override;
-
- // ISpAudio methods
- STDMETHODIMP SetState(SPAUDIOSTATE state, ULONGLONG reserved) override;
- STDMETHODIMP SetFormat(REFGUID guid_ref, const WAVEFORMATEX *wfex)
- override;
- STDMETHODIMP GetStatus(SPAUDIOSTATUS *status) override;
- STDMETHODIMP SetBufferInfo(const SPAUDIOBUFFERINFO *buf_info) override;
- STDMETHODIMP GetBufferInfo(SPAUDIOBUFFERINFO *buf_info) override;
- STDMETHODIMP GetDefaultFormat(GUID *format,
- WAVEFORMATEX **co_mem_wfex_out) override;
- STDMETHODIMP_(HANDLE) EventHandle(void) override;
- STDMETHODIMP GetVolumeLevel(ULONG *level) override;
- STDMETHODIMP SetVolumeLevel(ULONG level) override;
- STDMETHODIMP GetBufferNotifySize(ULONG *size) override;
- STDMETHODIMP SetBufferNotifySize(ULONG size) override;
-};
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/amf-h264.h
Deleted
-/*
-MIT License
-
-Copyright (c) 2016 Michael Fabian Dirks
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
-*/
-
-#pragma once
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
-#include <condition_variable>
-#include <algorithm>
-#include <mutex>
-#include <queue>
-#include <thread>
-#include <vector>
-#include <chrono>
-
-// Plugin
-#include "plugin.h"
-#include "amf.h"
-#include "api-base.h"
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
-
-namespace Plugin {
- namespace AMD {
- // Internal Properties
- enum class H264EncoderType : uint8_t {
- AVC = 0, // Advanced Video Coding
- SVC, // Scalable Video Coding
- HEVC, // High-Efficiency Video Coding (Discovered in amfrt64.dll)
- };
- enum class H264MemoryType : uint8_t {
- Host = 0, // Host-Managed Memory
- DirectX9, // DirectX9
- DirectX11, // DirectX11
- OpenGL, // OpenGL
- };
- enum class H264ColorFormat : uint8_t {
- // 4:2:0 Formats
- NV12 = 0, // NV12
- I420, // YUV 4:2:0
- // 4:2:2 Formats
- YUY2,
- // Uncompressed
- BGRA, // ARGB
- RGBA, // RGBA
- // Other
- GRAY, // Y 4:0:0
- };
- enum class H264ColorProfile : uint8_t {
- Rec601 = 0,
- Rec709,
- Rec2020, // Truer to world color, used for HDR
- };
-
- // Static Properties
- enum class H264Usage : uint8_t {
- Transcoding = 0, // Only one capable of streaming to services.
- UltraLowLatency, // Low Latency Recording or Network Streaming
- LowLatency, // Low Latency Recording or Network Streaming
- Webcam, // For SVC Recording and Streaming
- };
- enum class H264QualityPreset : uint8_t {
- Speed = 0,
- Balanced,
- Quality,
- };
- enum class H264Profile : uint16_t {
- Baseline = 66,
- Main = 77,
- High = 100,
- ConstrainedBaseline = 256,
- ConstrainedHigh = 257
- };
- enum class H264ProfileLevel : uint8_t {
- Automatic = 0,
- L10 = 10,
- L11,
- L12,
- L13,
- L20 = 20,
- L21,
- L22,
- L30 = 30,
- L31,
- L32,
- L40 = 40,
- L41,
- L42,
- L50 = 50,
- L51,
- L52,
- };
- enum class H264ScanType : uint8_t {
- Progressive = 0,
- Interlaced,
- };
- enum class H264CodingType : uint8_t {
- Default = 0,
- CABAC = 1,
- CALVC = 2,
- };
-
- // Dynamic Properties
- enum class H264RateControlMethod : uint8_t {
- ConstantQP = 0,
- ConstantBitrate,
- VariableBitrate_PeakConstrained,
- VariableBitrate_LatencyConstrained = 3,
- };
- enum class H264BFramePattern : uint8_t {
- None = 0,
- One,
- Two,
- Three,
- };
-
- // Experimental
- enum class H264SliceMode : uint8_t {
- Horizontal = 1,
- Vertical = 2
- };
- enum class H264SliceControlMode : uint8_t {
- Off = 0,
- Macroblock = 1, // AMF_VIDEO_ENCODER_SLICE_CTRL_MODE_MB
- Invalid,
- Macroblock_Row = 3 // AMF_VIDEO_ENCODER_SLICE_CTRL_MODE_MB_ROW
- };
-
- class H264Encoder {
- #pragma region Initializer & Finalizer
- public:
- H264Encoder(
- H264EncoderType p_Type,
- std::string p_VideoAPI,
- uint64_t p_VideoAdapterId,
- bool p_OpenCL,
- H264ColorFormat p_SurfaceFormat = H264ColorFormat::NV12
- );
- ~H264Encoder();
- #pragma endregion Initializer & Finalizer
-
- public:
- void Start();
- void Restart();
- void Stop();
- bool IsStarted();
-
- bool SendInput(struct encoder_frame* frame);
- bool GetOutput(struct encoder_packet* packet, bool* received_packet);
- bool GetExtraData(uint8_t**& data, size_t*& size);
- void GetVideoInfo(struct video_scale_info*& vsi);
-
- #pragma region Properties
- public:
- void LogProperties();
-
- // Static
-
- #pragma region Startup Properties
- // Set which Usage preset to use.
- // Changing this will also change a lot of other properties.
- void SetUsage(H264Usage usage);
- H264Usage GetUsage();
-
- // Set which Quality Preset AMF should use.
- // Affects the quality of the output.
- void SetQualityPreset(H264QualityPreset preset);
- H264QualityPreset GetQualityPreset();
-
- // Set the Profile the output should have.
- void SetProfile(H264Profile profile);
- H264Profile GetProfile();
-
- // Set the Profile Level the output should have.
- void SetProfileLevel(H264ProfileLevel level);
- H264ProfileLevel GetProfileLevel();
- #pragma endregion Startup Properties
-
- #pragma region Frame Properties
- // Set which Color Profile the input frame is.
- void SetColorProfile(H264ColorProfile profile);
- H264ColorProfile GetColorProfile();
-
- // Set if the input frame is in full color range.
- void SetFullRangeColorEnabled(bool enabled);
- bool IsFullRangeColorEnabled();
-
- // Resolution for the input and output.
- void SetResolution(uint32_t width, uint32_t height);
- std::pair<uint32_t, uint32_t> GetResolution();
-
- // Framerate of the input and output.
- void SetFrameRate(uint32_t num, uint32_t den);
- std::pair<uint32_t, uint32_t> GetFrameRate();
-
- // Scanning method for input (and output?).
- void SetScanType(H264ScanType scanType);
- H264ScanType GetScanType();
- #pragma endregion Frame Properties
-
- // Dynamic
-
- #pragma region Rate Control
- /* Selects the rate control method:
- * - CQP - Constrained QP,
- * - CBR - Constant Bitrate,
- * - VBR - Peak Constrained VBR,
- * - VBR_LAT - Latency Constrained VBR
- *
- * Remarks:
- * - When SVC encoding is enabled, all Rate-control parameters (with some restrictions) can be configured differently for a particular SVC-layer. An SVC-layer is denoted by an index pair [SVC-Temporal Layer index][SVC-Quality Layer index]. E.g. The bitrate may be configured differently for SVC-layers [0][0] and [1][0].
- * - We restrict all SVC layers to have the same Rate Control method. Some RC parameters are not enabled with SVC encoding (e.g. all parameters related to B-Frames).
- **/
- void SetRateControlMethod(H264RateControlMethod method);
- H264RateControlMethod GetRateControlMethod();
-
- /* Sets the target bitrate */
- void SetTargetBitrate(uint32_t bitrate);
- uint32_t GetTargetBitrate();
-
- /* Sets the peak bitrate */
- void SetPeakBitrate(uint32_t bitrate);
- uint32_t GetPeakBitrate();
-
- /* Sets the minimum QP */
- void SetMinimumQP(uint8_t qp);
- uint8_t GetMinimumQP();
-
- /* Sets the maximum QP */
- void SetMaximumQP(uint8_t qp);
- uint8_t GetMaximumQP();
-
- // Set the fixed QP value for I-Frames.
- void SetIFrameQP(uint8_t qp);
- uint8_t GetIFrameQP();
-
- // Set the fixed QP value for P-Frames.
- void SetPFrameQP(uint8_t qp);
- uint8_t GetPFrameQP();
-
- // Set the fixed QP value for B-Frames.
- void SetBFrameQP(uint8_t qp);
- uint8_t GetBFrameQP();
-
- // Set the Video Buffer Verifier (VBV) size in bits per second (bps).
- void SetVBVBufferSize(uint32_t size);
- // Set the Video Buffer Verifier (VBV) size using a strictness constraint.
- void SetVBVBufferAutomatic(double_t strictness);
- uint32_t GetVBVBufferSize();
-
- /* Sets the initial VBV Buffer Fullness */
- void SetInitialVBVBufferFullness(double_t fullness);
- double_t GetInitialVBVBufferFullness();
-
- /* Enables/Disables filler data */
- void SetFillerDataEnabled(bool enabled);
- bool IsFillerDataEnabled();
-
- /* Enables skip frame for rate control */
- void SetFrameSkippingEnabled(bool enabled);
- bool IsFrameSkippingEnabled();
-
- /* Enables/Disables constraints on QP variation within a picture to meet HRD requirement(s) */
- void SetEnforceHRDRestrictionsEnabled(bool enforce);
- bool IsEnforceHRDRestrictionsEnabled();
- #pragma endregion Rate Control
-
- #pragma region Picture Control
- // Set the Instantaneous-Decoder-Refresh (IDR) Period in frames.
- void SetIDRPeriod(uint32_t period);
- uint32_t GetIDRPeriod();
-
- #pragma region B-Frames
- /* Sets the number of consecutive B-Frames. BFramesPattern = 0 indicates that B-Frames are not used */
- void SetBFramePattern(H264BFramePattern pattern);
- H264BFramePattern GetBFramePattern();
-
- /* Selects the delta QP of non-reference B-Frames with respect to the last non-B-Frame */
- void SetBFrameDeltaQP(int8_t qp);
- int8_t GetBFrameDeltaQP();
-
- /* Enables or disables using B-Frames as references */
- void SetBFrameReferenceEnabled(bool enabled);
- bool IsBFrameReferenceEnabled();
-
- /* Selects delta QP of reference B-Frames with respect to the last non-B-Frame */
- void SetBFrameReferenceDeltaQP(int8_t qp);
- int8_t GetBFrameReferenceDeltaQP();
- #pragma endregion B-Frames
- #pragma endregion Picture Control
-
- #pragma region Miscellaneous
- /* Turns on/off the Deblocking Filter */
- void SetDeblockingFilterEnabled(bool enabled);
- bool IsDeblockingFilterEnabled();
-
- #pragma region Motion Estimation
- /* Turns on/off half-pixel motion estimation */
- void SetHalfPixelMotionEstimationEnabled(bool enabled);
- bool IsHalfPixelMotionEstimationEnabled();
-
- /* Turns on/off quarter-pixel motion estimation */
- void SetQuarterPixelMotionEstimationEnabled(bool enabled);
- bool IsQuarterPixelMotionEstimationEnabled();
- #pragma endregion Motion Estimation
- #pragma endregion Miscellaneous
-
- #pragma region Experimental Properties
- // Get the maximum amount of MBps the encoder can output.
- uint32_t GetMaxMBPerSec();
-
- /* Coding Type */
- void SetCodingType(H264CodingType type);
- H264CodingType GetCodingType();
-
- void SetWaitForTaskEnabled(bool enabled);
- bool IsWaitForTaskEnabled();
-
- // Preanalysis Pass is AMDs version of Two-Pass hardware encoding.
- void SetPreAnalysisPassEnabled(bool enabled);
- bool IsPreAnalysisPassEnabled();
-
- // VBAQ = Variable Bitrate Average Quality?
- // - EanbleVBAQ (bool)
- void SetVBAQEnabled(bool enabled);
- bool IsVBAQEnabled();
-
- /* Sets the headers insertion spacing */
- void SetHeaderInsertionSpacing(uint32_t spacing); // Similar to IDR Period, spacing (in frames) between headers.
- uint32_t GetHeaderInsertionSpacing();
-
- /* The number of long-term references controlled by the user.
- *
- * Remarks:
- * - When == 0, the encoder may or may not use LTRs during encoding.
- * - When >0, the user has control over all LTR.
- * - With user control of LTR, B-Frames and Intra-refresh features are not supported.
- * - The actual maximum number of LTRs allowed depends on H.264 Annex A Table A-1 Level limits, which defines dependencies between the H.264 Level number, encoding resolution, and DPB size. The DPB size limit impacts the maximum number of LTR allowed.
- **/
- void SetMaximumLongTermReferenceFrames(uint32_t maximumLTRFrames); // Long-Term Reference Frames. If 0, Encoder decides, if non-0 B-Frames and Intra-Refresh are not supported.
- uint32_t GetMaximumLongTermReferenceFrames();
-
- /* Sets Maximum AU Size in bits */
- void SetMaximumAccessUnitSize(uint32_t size);
- uint32_t GetMaximumAccessUnitSize();
-
- void SetMaximumReferenceFrames(uint32_t frameCount);
- uint32_t GetMaximumReferenceFrames();
-
- void SetAspectRatio(uint32_t x, uint32_t y);
- std::pair<uint32_t, uint32_t> GetAspectRatio();
-
- #pragma region Group of Pictures
- void SetGOPSize(uint32_t gopSize);
- uint32_t GetGOPSize();
-
- void SetGOPAlignmentEnabled(bool enabled);
- bool IsGOPAlignementEnabled();
- #pragma endregion Group of Pictures
-
- #pragma region Intra Refresh
- // Macroblocks per Intra-Refresh Slot
- // Intra-Refresh Coding
- void SetIntraRefreshMacroblocksPerSlot(uint32_t macroblocks);
- uint32_t GetIntraRefreshMacroblocksPerSlot();
-
- // - IntraRefreshNumOfStripes (0 - INT_MAX)
- // Intra-Refresh Coding
- void SetIntraRefreshNumberOfStripes(uint32_t stripes);
- uint32_t GetIntraRefreshNumberOfStripes();
- #pragma endregion Intra Refresh
-
- #pragma region Slicing
- /* Sets the number of slices per frame */
- void SetSlicesPerFrame(uint32_t slices);
- uint32_t GetSlicesPerFrame();
-
- // - SliceMode (1 - 2, Default is 1)
- void SetSliceMode(H264SliceMode mode);
- H264SliceMode GetSliceMode();
-
- // - MaxSliceSize (1 - INT_MAX)
- void SetMaximumSliceSize(uint32_t size);
- uint32_t GetMaximumSliceSize();
-
- // - SliceControlMode (0 - 3)
- void SetSliceControlMode(H264SliceControlMode mode);
- H264SliceControlMode GetSliceControlMode();
-
- // - SliceControlSize (0 - INT_MAX)
- void SetSliceControlSize(uint32_t size);
- uint32_t GetSliceControlSize();
- #pragma endregion Slicing
-
- // More:
- // - CodecId (H264 = 5, H264SVC = 8, 2xUNKNOWN)
- // - EngineType (Auto = 0, DX9 = 1, DX11 = 2, XVBA = 3)
- // - ConstraintSetFlags (0 - 255, 1 byte bitset?)
- // - LowLatencyInternal (bool)
- // - CommonLowLatencyInternal (bool)
- // - UniqueInstance (0 - INT_MAX)
- // - MultiInstanceMode (bool)
- // - MultiInstanceCurrentQueue (0 - 1)
- // - InstanceId (-1 - [# of Streams - 1])
- // - EncoderMaxInstances (1 - [# of Instances])
- #pragma endregion Experimental Properties
-
- #pragma endregion Properties
-
- // Threading
- private:
- static void InputThreadMain(Plugin::AMD::H264Encoder* p_this);
- void InputThreadLogic();
- static void OutputThreadMain(Plugin::AMD::H264Encoder* p_this);
- void OutputThreadLogic();
- inline amf::AMFSurfacePtr CreateSurfaceFromFrame(struct encoder_frame*& frame);
-
- #pragma region Members
- private:
- // AMF Data References
- std::shared_ptr<Plugin::AMD::AMF> m_AMF;
- amf::AMFFactory* m_AMFFactory;
- amf::AMFContextPtr m_AMFContext;
- amf::AMFComponentPtr m_AMFConverter;
- amf::AMFComponentPtr m_AMFEncoder;
- amf::AMFComputePtr m_AMFCompute;
-
- // API References
- std::shared_ptr<Plugin::API::Base> m_API;
- Plugin::API::Adapter m_APIAdapter;
- void* m_APIInstance;
-
- // Static Buffers
- std::vector<uint8_t> m_PacketDataBuffer;
- std::vector<uint8_t> m_ExtraDataBuffer;
-
- // Structured Queue
- struct {
- std::queue<amf::AMFSurfacePtr> queue;
-
- // Threading
- std::thread thread;
- std::mutex mutex;
- std::condition_variable condvar;
- std::mutex queuemutex;
- } m_Input;
- struct {
- std::queue<amf::AMFDataPtr> queue;
-
- // Threading
- std::thread thread;
- std::mutex mutex;
- std::condition_variable condvar;
- std::mutex queuemutex;
- } m_Output;
-
- // Internal Properties
- H264EncoderType m_EncoderType;
- H264MemoryType m_MemoryType;
- bool m_OpenCL;
- H264ColorFormat m_ColorFormat;
- bool m_Flag_IsStarted,
- m_Flag_FirstFrameSubmitted,
- m_Flag_FirstFrameReceived;
- std::pair<uint32_t, uint32_t> m_FrameSize,
- m_FrameRate;
- double_t m_FrameRateDivisor,
- m_FrameRateReverseDivisor;
- size_t m_InputQueueLimit,
- m_InputQueueLastSize;
- uint32_t m_TimerPeriod;
- H264ColorProfile m_ColorProfile;
- std::chrono::time_point<std::chrono::high_resolution_clock> m_LastQueueWarnMessageTime;
-
- #pragma endregion Members
- };
- }
-}
\ No newline at end of file
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/amf-h264.cpp
Deleted
-/*
-MIT License
-
-Copyright (c) 2016 Michael Fabian Dirks
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
-*/
-
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
-#include <chrono>
-
-#include "amf-capabilities.h"
-#include "amf-h264.h"
-#include "misc-util.cpp"
-#include "api-base.h"
-
-// AMF
-#include "components/VideoEncoderVCE.h"
-#include "components/VideoConverter.h"
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
-
-#define AMF_PROPERTY_FRAME L"Frame"
-#define AMF_PROPERTY_TIME_SENDINPUT L"TimeSendInput"
-#define AMF_PROPERTY_TIME_CREATESURFACE L"TimeCreateSurface"
-#define AMF_PROPERTY_TIME_CONVERT L"TimeConvert"
-#define AMF_PROPERTY_TIME_ENCODE L"TimeEncode"
-
-// Logging and Exception Helpers
-static void FormatTextWithAMFError(std::vector<char>* buffer, const char* format, AMF_RESULT res) {
- sprintf(buffer->data(), format, Plugin::AMD::AMF::GetInstance()->GetTrace()->GetResultText(res), res);
-}
-
-template<typename _T>
-static void FormatTextWithAMFError(std::vector<char>* buffer, const char* format, _T other, AMF_RESULT res) {
- sprintf(buffer->data(), format, other, Plugin::AMD::AMF::GetInstance()->GetTrace()->GetResultText(res), res);
-}
-
-// Class
-#ifdef _DEBUG
-static void fastPrintVariant(const char* text, amf::AMFVariantStruct variant) {
- std::vector<char> buf(1024);
- switch (variant.type) {
- case amf::AMF_VARIANT_EMPTY:
- sprintf(buf.data(), "%s%s", text, "Empty");
- break;
- case amf::AMF_VARIANT_BOOL:
- sprintf(buf.data(), "%s%s", text, variant.boolValue ? "true" : "false");
- break;
- case amf::AMF_VARIANT_INT64:
- sprintf(buf.data(), "%s%lld", text, variant.int64Value);
- break;
- case amf::AMF_VARIANT_DOUBLE:
- sprintf(buf.data(), "%s%f", text, variant.doubleValue);
- break;
- case amf::AMF_VARIANT_RECT:
- sprintf(buf.data(), "%s[%ld,%ld,%ld,%ld]", text,
- variant.rectValue.top, variant.rectValue.left,
- variant.rectValue.bottom, variant.rectValue.right);
- break;
- case amf::AMF_VARIANT_SIZE:
- sprintf(buf.data(), "%s%ldx%ld", text,
- variant.sizeValue.width, variant.sizeValue.height);
- break;
- case amf::AMF_VARIANT_POINT:
- sprintf(buf.data(), "%s[%ld,%ld]", text,
- variant.pointValue.x, variant.pointValue.y);
- break;
- case amf::AMF_VARIANT_RATE:
- sprintf(buf.data(), "%s%ld/%ld", text,
- variant.rateValue.num, variant.rateValue.den);
- break;
- case amf::AMF_VARIANT_RATIO:
- sprintf(buf.data(), "%s%ld:%ld", text,
- variant.ratioValue.num, variant.ratioValue.den);
- break;
- case amf::AMF_VARIANT_COLOR:
- sprintf(buf.data(), "%s(%d,%d,%d,%d)", text,
- variant.colorValue.r,
- variant.colorValue.g,
- variant.colorValue.b,
- variant.colorValue.a);
- break;
- case amf::AMF_VARIANT_STRING:
- sprintf(buf.data(), "%s'%s'", text,
- variant.stringValue);
- break;
- case amf::AMF_VARIANT_WSTRING:
- sprintf(buf.data(), "%s'%ls'", text,
- variant.wstringValue);
- break;
- }
- AMF_LOG_INFO("%s", buf.data());
-};
-
-static void printDebugInfo(amf::AMFComponentPtr m_AMFEncoder) {
- amf::AMFPropertyInfo* pInfo;
- size_t propCount = m_AMFEncoder->GetPropertyCount();
- AMF_LOG_INFO("-- Internal AMF Encoder Properties --");
- for (size_t propIndex = 0; propIndex < propCount; propIndex++) {
- static const char* typeToString[] = {
- "Empty",
- "Boolean",
- "Int64",
- "Double",
- "Rect",
- "Size",
- "Point",
- "Rate",
- "Ratio",
- "Color",
- "String",
- "WString",
- "Interface"
- };
-
- AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(propIndex, (const amf::AMFPropertyInfo**) &pInfo);
- if (res != AMF_OK)
- continue;
- AMF_LOG_INFO(" [%ls] %ls (Type: %s, Index %Iu)",
- pInfo->name, pInfo->desc, typeToString[pInfo->type], propIndex);
- AMF_LOG_INFO(" Content Type: %d",
- pInfo->contentType);
- AMF_LOG_INFO(" Access: %s%s%s",
- (pInfo->accessType & amf::AMF_PROPERTY_ACCESS_READ) ? "R" : "",
- (pInfo->accessType & amf::AMF_PROPERTY_ACCESS_WRITE) ? "W" : "",
- (pInfo->accessType & amf::AMF_PROPERTY_ACCESS_WRITE_RUNTIME) ? "X" : "");
-
- AMF_LOG_INFO(" Values:");
- amf::AMFVariantStruct curStruct = amf::AMFVariantStruct();
- m_AMFEncoder->GetProperty(pInfo->name, &curStruct);
- fastPrintVariant(" Current: ", curStruct);
- fastPrintVariant(" Default: ", pInfo->defaultValue);
- fastPrintVariant(" Minimum: ", pInfo->minValue);
- fastPrintVariant(" Maximum: ", pInfo->maxValue);
- if (pInfo->pEnumDescription) {
- AMF_LOG_INFO(" Enumeration: ");
- const amf::AMFEnumDescriptionEntry* pEnumEntry = pInfo->pEnumDescription;
- while (pEnumEntry->name != nullptr) {
- AMF_LOG_INFO(" %ls (%ld)",
- pEnumEntry->name,
- pEnumEntry->value);
- pEnumEntry++;
- }
- }
- }
-}
-#endif
-
-Plugin::AMD::H264Encoder::H264Encoder(
- H264EncoderType p_Type,
- std::string p_VideoAPI,
- uint64_t p_VideoAdapterId,
- bool p_OpenCL,
- H264ColorFormat p_SurfaceFormat/* = VCESurfaceFormat_NV12*/
-) {
- #pragma region Assign Default Values
- m_EncoderType = p_Type;
- m_ColorFormat = p_SurfaceFormat;
- m_OpenCL = p_OpenCL;
- m_Flag_IsStarted = false;
- m_Flag_FirstFrameReceived = false;
- m_Flag_FirstFrameSubmitted = false;
- m_FrameSize.first = 64; m_FrameSize.second = 64;
- m_FrameRate.first = 30; m_FrameRate.second = 1;
- m_FrameRateDivisor = ((double_t)m_FrameRate.first / (double_t)m_FrameRate.second);
- m_FrameRateReverseDivisor = ((double_t)m_FrameRate.second / (double_t)m_FrameRate.first);
- m_InputQueueLimit = (uint32_t)(m_FrameRateDivisor * 3);
- m_InputQueueLastSize = 0;
- m_TimerPeriod = 1;
- m_LastQueueWarnMessageTime = std::chrono::high_resolution_clock::time_point(std::chrono::high_resolution_clock::duration(0));
- #pragma endregion Assign Default Values
-
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Initializing...");
-
- AMF_RESULT res = AMF_OK;
- // AMF
- m_AMF = AMF::GetInstance();
- m_AMFFactory = m_AMF->GetFactory();
- /// Create an AMF context.
- if (m_AMFFactory->CreateContext(&m_AMFContext) != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> CreateContext failed with error %ls (code %ld).", res);
- /// Initialize to a specific API.
- m_API = Plugin::API::Base::GetAPIByName(p_VideoAPI);
- m_APIAdapter = m_API->GetAdapterById(p_VideoAdapterId & UINT_MAX, (p_VideoAdapterId >> 32) & UINT_MAX);
- m_APIInstance = m_API->CreateInstanceOnAdapter(m_APIAdapter);
- switch (m_API->GetType()) {
- case Plugin::API::Type::Direct3D11:
- m_MemoryType = H264MemoryType::DirectX11;
- res = m_AMFContext->InitDX11(m_API->GetContextFromInstance(m_APIInstance));
- break;
-	case Plugin::API::Type::Direct3D9:
-		m_MemoryType = H264MemoryType::DirectX9;
-		res = m_AMFContext->InitDX9(m_API->GetContextFromInstance(m_APIInstance));
- break;
- case Plugin::API::Type::OpenGL:
- m_MemoryType = H264MemoryType::OpenGL;
- res = m_AMFContext->InitOpenGL(m_API->GetContextFromInstance(m_APIInstance), GetDesktopWindow(), nullptr);
- break;
- case Plugin::API::Type::Host:
- m_MemoryType = H264MemoryType::Host;
- m_OpenCL = false;
- break;
- }
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Initializing Video API failed with error %ls (code %ld).", res);
-
- /// Initialize OpenCL if user selected it.
- if (m_OpenCL) {
- res = m_AMFContext->InitOpenCL(nullptr);
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> InitOpenCL failed with error %ls (code %ld).", res);
- m_AMFContext->GetCompute(amf::AMF_MEMORY_OPENCL, &m_AMFCompute);
- }
-
- /// Create the AMF Encoder component.
- if (m_AMFFactory->CreateComponent(m_AMFContext, Plugin::Utility::VCEEncoderTypeAsAMF(p_Type), &m_AMFEncoder) != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Creating a component object failed with error %ls (code %ld).", res);
-
- /// Create the AMF Converter component.
- if (m_AMFFactory->CreateComponent(m_AMFContext, AMFVideoConverter, &m_AMFConverter) != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Unable to create VideoConverter component, error %ls (code %ld).", res);
-
- #ifdef _DEBUG
- printDebugInfo(m_AMFEncoder);
- #endif
-
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Initialized.");
-}
-
-Plugin::AMD::H264Encoder::~H264Encoder() {
- if (m_Flag_IsStarted)
- Stop();
-
- // AMF
- if (m_AMFConverter)
- m_AMFConverter->Terminate();
- if (m_AMFEncoder)
- m_AMFEncoder->Terminate();
- if (m_AMFContext)
- m_AMFContext->Terminate();
- m_AMFConverter = nullptr;
- m_AMFEncoder = nullptr;
- m_AMFContext = nullptr;
-
- // API
- if (m_APIInstance)
- m_API->DestroyInstance(m_APIInstance);
- m_APIInstance = nullptr;
- m_API = nullptr;
-}
-
-void Plugin::AMD::H264Encoder::Start() {
- AMF_RESULT res = AMF_UNEXPECTED;
-
- // Set proper Timer resolution.
- m_TimerPeriod = 1;
- while (timeBeginPeriod(m_TimerPeriod) == TIMERR_NOCANDO) {
- ++m_TimerPeriod;
- }
-
- // Initialize Converter
- if (m_AMFConverter->SetProperty(AMF_VIDEO_CONVERTER_MEMORY_TYPE, Utility::MemoryTypeAsAMF(m_MemoryType)) != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Memory Type not supported by VideoConverter component, error %ls (code %ld).", res);
- if (m_AMFConverter->SetProperty(AMF_VIDEO_CONVERTER_OUTPUT_FORMAT, amf::AMF_SURFACE_NV12))
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Color Format not supported by VideoConverter component, error %ls (code %ld).", res);
- m_AMFConverter->SetProperty(AMF_VIDEO_CONVERTER_COLOR_PROFILE, (size_t)this->GetColorProfile());
- if (m_AMFConverter->SetProperty(L"FullRangeColor", this->IsFullRangeColorEnabled()) != AMF_OK)
- m_AMFConverter->SetProperty(L"NominalRange", this->IsFullRangeColorEnabled());
- res = m_AMFConverter->Init(Utility::SurfaceFormatAsAMF(m_ColorFormat), m_FrameSize.first, m_FrameSize.second);
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Converter initialization failed with error %ls (code %ld).", res);
-
- // Initialize Encoder
- res = m_AMFEncoder->Init(amf::AMF_SURFACE_NV12,
- m_FrameSize.first, m_FrameSize.second);
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Encoder initialization failed with error %ls (code %ld).", res);
-
- m_Flag_IsStarted = true;
-
- // Threading
- m_Input.thread = std::thread(&(Plugin::AMD::H264Encoder::InputThreadMain), this);
- m_Output.thread = std::thread(&(Plugin::AMD::H264Encoder::OutputThreadMain), this);
-
- #ifdef _DEBUG
- printDebugInfo(m_AMFEncoder);
- #endif
-}
-
-void Plugin::AMD::H264Encoder::Restart() {
- if (!m_Flag_IsStarted)
- return;
-
- std::unique_lock<std::mutex> ilock(m_Input.mutex);
- std::unique_lock<std::mutex> olock(m_Output.mutex);
-
- // Create Encoder
- AMF_RESULT res = m_AMFEncoder->ReInit(m_FrameSize.first, m_FrameSize.second);
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Initialization failed with error %ls (code %ld).", res);
-}
-
-void Plugin::AMD::H264Encoder::Stop() {
- // Restore Timer precision.
- if (m_TimerPeriod != 0) {
- timeEndPeriod(m_TimerPeriod);
- }
-
- m_Flag_IsStarted = false;
-
- // Threading
- m_Output.condvar.notify_all();
- #if defined _WIN32 || defined _WIN64
- { // Windows: Force terminate Thread after 1 second of waiting.
- std::thread::native_handle_type hnd = m_Output.thread.native_handle();
-
- uint32_t res = WaitForSingleObject((HANDLE)hnd, 1000);
- switch (res) {
- case WAIT_OBJECT_0:
- m_Output.thread.join();
- break;
- default:
- m_Output.thread.detach();
- TerminateThread((HANDLE)hnd, 0);
- break;
- }
- }
- #else
- m_Output.thread.join();
- #endif
-
- m_Input.condvar.notify_all();
- #if defined _WIN32 || defined _WIN64
- { // Windows: Force terminate Thread after 1 second of waiting.
- std::thread::native_handle_type hnd = m_Input.thread.native_handle();
-
- uint32_t res = WaitForSingleObject((HANDLE)hnd, 1000);
- switch (res) {
- case WAIT_OBJECT_0:
- m_Input.thread.join();
- break;
- default:
- m_Input.thread.detach();
- TerminateThread((HANDLE)hnd, 0);
- break;
- }
- }
- #else
- m_Input.thread.join();
- #endif
-
- // Stop AMF Encoder
- if (m_AMFEncoder) {
- m_AMFEncoder->Drain();
- m_AMFEncoder->Flush();
- }
-
- // Clear Queues, Data
- std::queue<amf::AMFSurfacePtr>().swap(m_Input.queue);
- std::queue<amf::AMFDataPtr>().swap(m_Output.queue);
- m_PacketDataBuffer.clear();
- m_ExtraDataBuffer.clear();
-}
-
-bool Plugin::AMD::H264Encoder::IsStarted() {
- return m_Flag_IsStarted;
-}
-
-bool Plugin::AMD::H264Encoder::SendInput(struct encoder_frame* frame) {
- // Early-Exception if not encoding.
- if (!m_Flag_IsStarted) {
- const char* error = "<" __FUNCTION_NAME__ "> Attempted to send input while not running.";
- AMF_LOG_ERROR("%s", error);
- throw std::exception(error);
- }
-
- /* Performance Monitoring */ std::chrono::high_resolution_clock::time_point tpSend = std::chrono::high_resolution_clock::now();
-
-	// Attempt to queue for up to one frame duration (forces "Encoding overloaded" message to appear).
- bool queueSuccessful = false;
- auto queueStart = std::chrono::high_resolution_clock::now();
- auto queueDuration = std::chrono::nanoseconds((uint64_t)floor(m_FrameRateReverseDivisor * 1000000));
- size_t queueSize = m_InputQueueLimit;
- do {
- // Wake up submission thread.
- m_Input.condvar.notify_all();
-
- {
- std::unique_lock<std::mutex> qlock(m_Input.queuemutex);
- queueSize = m_Input.queue.size();
- }
-
- // Push into queue if it has room.
- if (queueSize < m_InputQueueLimit) {
- /* Performance Monitoring */ std::chrono::high_resolution_clock::time_point tpCreateSurface = std::chrono::high_resolution_clock::now();
- amf::AMFSurfacePtr pAMFSurface = CreateSurfaceFromFrame(frame);
- /* Performance Monitoring */ auto timeCreate = std::chrono::high_resolution_clock::now() - tpCreateSurface;
- if (!pAMFSurface) {
-				AMF_LOG_ERROR("Unable to copy frame for submission, terminating...");
- return false;
- } else {
- pAMFSurface->SetPts(frame->pts / m_FrameRate.second);
- pAMFSurface->SetDuration((uint64_t)ceil(m_FrameRateReverseDivisor * AMF_SECOND));
- pAMFSurface->SetProperty(AMF_PROPERTY_FRAME, frame->pts);
- /* Performance Monitoring */ pAMFSurface->SetProperty(AMF_PROPERTY_TIME_SENDINPUT, std::chrono::nanoseconds(tpSend.time_since_epoch()).count());
- /* Performance Monitoring */ pAMFSurface->SetProperty(AMF_PROPERTY_TIME_CREATESURFACE, std::chrono::nanoseconds(timeCreate).count());
- /* Performance Monitoring */ pAMFSurface->SetProperty(AMF_PROPERTY_TIME_CONVERT, 0);
- /* Performance Monitoring */ pAMFSurface->SetProperty(AMF_PROPERTY_TIME_ENCODE, 0);
- }
-
- {
- std::unique_lock<std::mutex> qlock(m_Input.queuemutex);
- m_Input.queue.push(pAMFSurface);
- queueSize++;
- }
- queueSuccessful = true;
- break;
- }
-
- // Sleep
- std::this_thread::sleep_for(queueDuration / 4);
- } while ((queueSuccessful == false) && (std::chrono::high_resolution_clock::now() - queueStart <= queueDuration));
-
- // Report status.
- auto timedelta = std::chrono::high_resolution_clock::now() - m_LastQueueWarnMessageTime;
- if (timedelta >= std::chrono::seconds(1)) { // Only show these messages once per second.
- if (queueSuccessful) {
- int32_t queueSizeDelta = ((int32_t)m_InputQueueLastSize - (int32_t)queueSize);
-
- if (queueSizeDelta >= 5) {
- AMF_LOG_INFO("GPU Encoder is catching up, queue is shrinking... (%Iu,%+ld,%Iu)",
- m_InputQueueLastSize,
- queueSizeDelta,
- queueSize);
- m_InputQueueLastSize = queueSize;
- } else if (queueSizeDelta <= -5) {
- AMF_LOG_WARNING("GPU Encoder overloaded, queue is growing... (%Iu,%+ld,%Iu)",
- m_InputQueueLastSize, queueSizeDelta, queueSize);
- m_InputQueueLastSize = queueSize;
- }
- } else {
- AMF_LOG_ERROR("GPU Encoder overloaded, dropping frame instead...");
- }
- m_LastQueueWarnMessageTime = std::chrono::high_resolution_clock::now();
- }
-
- /// Signal Thread Wakeup
- m_Input.condvar.notify_all();
-
- // WORKAROUND: Block for at most 5 seconds until the first frame has been submitted.
- if (!m_Flag_FirstFrameSubmitted) {
- auto startsubmit = std::chrono::high_resolution_clock::now();
- auto diff = std::chrono::high_resolution_clock::now() - startsubmit;
- do {
- diff = std::chrono::high_resolution_clock::now() - startsubmit;
- std::this_thread::sleep_for(std::chrono::milliseconds(1));
- } while ((diff <= std::chrono::seconds(5)) && !m_Flag_FirstFrameSubmitted);
- if (!m_Flag_FirstFrameSubmitted)
- throw std::exception("Unable to submit first frame, terminating...");
- else {
- uint64_t dtime = (uint64_t)diff.count();
- AMF_LOG_INFO("First submission took %" PRIu64 ".%" PRIu64 " seconds.", dtime / 1000000000, dtime % 1000000000);
- }
- }
-
- return true;
-}
-
-bool Plugin::AMD::H264Encoder::GetOutput(struct encoder_packet* packet, bool* received_packet) {
- // Early-Exception if not encoding.
- if (!m_Flag_IsStarted) {
-		const char* error = "<" __FUNCTION_NAME__ "> Attempted to retrieve output while not running.";
- AMF_LOG_ERROR("%s", error);
- throw std::exception(error);
- }
-
- /* Performance Monitoring */ std::chrono::high_resolution_clock::time_point tpRetrieve = std::chrono::high_resolution_clock::now();
-
- // Signal Output Thread to wake up.
- m_Output.condvar.notify_all();
-
- // Dequeue a Packet
- bool queueSuccessful = false;
- auto queueStart = std::chrono::high_resolution_clock::now();
- auto queueDuration = std::chrono::nanoseconds((uint64_t)floor(m_FrameRateReverseDivisor * 1000000));
- do {
- std::unique_lock<std::mutex> qlock(m_Output.queuemutex);
- if (m_Output.queue.size() != 0)
- queueSuccessful = true;
-
- // Sleep
- std::this_thread::sleep_for(queueDuration / 4);
- } while ((queueSuccessful == false) && (std::chrono::high_resolution_clock::now() - queueStart <= queueDuration));
- if (!queueSuccessful)
- return true;
-
- // We've got a DataPtr, let's use it.
- {
- amf::AMFDataPtr pAMFData;
- {
- std::unique_lock<std::mutex> qlock(m_Output.queuemutex);
- pAMFData = m_Output.queue.front();
- m_Output.queue.pop();
- }
- amf::AMFBufferPtr pAMFBuffer = amf::AMFBufferPtr(pAMFData);
-
- // Assemble Packet
- packet->type = OBS_ENCODER_VIDEO;
- /// Data
- packet->size = pAMFBuffer->GetSize();
- if (m_PacketDataBuffer.size() < packet->size) {
- size_t newBufferSize = (size_t)exp2(ceil(log2(packet->size)));
- AMF_LOG_DEBUG("Packet Buffer was resized to %Iu byte from %Iu byte.", newBufferSize, m_PacketDataBuffer.size());
- m_PacketDataBuffer.resize(newBufferSize);
- }
- packet->data = m_PacketDataBuffer.data();
- std::memcpy(packet->data, pAMFBuffer->GetNative(), packet->size);
- /// Timestamps
- packet->dts = (pAMFData->GetPts() - 2) * m_FrameRate.second; // Offset by 2 to support B-Frames
- pAMFBuffer->GetProperty(L"Frame", &packet->pts);
- { /// Packet Priority & Keyframe
- uint64_t pktType;
- pAMFData->GetProperty(AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE, &pktType);
-
- switch ((AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_ENUM)pktType) {
- case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_IDR://
- packet->keyframe = true; // IDR-Frames are Key-Frames that contain a lot of information.
- packet->priority = 3; // Highest priority, always continue streaming with these.
- //packet->drop_priority = 3; // Dropped IDR-Frames can only be replaced by the next IDR-Frame.
- break;
- case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_I: // I-Frames need only a previous I- or IDR-Frame.
- packet->priority = 2; // I- and IDR-Frames will most likely be present.
- // packet->drop_priority = 2; // So we can continue with a I-Frame when streaming.
- break;
- case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_P: // P-Frames need either a previous P-, I- or IDR-Frame.
- packet->priority = 1; // We can safely assume that at least one of these is present.
- // packet->drop_priority = 1; // So we can continue with a P-Frame when streaming.
- break;
- case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_B: // B-Frames need either a parent B-, P-, I- or IDR-Frame.
- packet->priority = 0; // We don't know if the last non-dropped frame was a B-Frame.
- // packet->drop_priority = 1; // So require a P-Frame or better to continue streaming.
- break;
- }
- }
- *received_packet = true;
-
- // Debug: Packet Information
- /// Convert File Name and Function Name
- static std::vector<wchar_t> fileName(2048);
- mbstowcs(fileName.data(), __FILE__, fileName.size());
-
- /// Timing Information
- uint64_t debugPacketType, debugDTS, debugPTS, debugDuration,
- debugTimeSend,
- debugTimeCreate,
- debugTimeConvert,
- debugTimeEncode;
- pAMFData->GetProperty(AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE, &debugPacketType);
- debugDTS = pAMFData->GetPts();
- pAMFData->GetProperty(AMF_PROPERTY_FRAME, &debugPTS);
- debugDuration = pAMFData->GetDuration();
- pAMFData->GetProperty(AMF_PROPERTY_TIME_SENDINPUT, &debugTimeSend);
- pAMFData->GetProperty(AMF_PROPERTY_TIME_CREATESURFACE, &debugTimeCreate);
- pAMFData->GetProperty(AMF_PROPERTY_TIME_CONVERT, &debugTimeConvert);
- pAMFData->GetProperty(AMF_PROPERTY_TIME_ENCODE, &debugTimeEncode);
- uint64_t totalTimeSendRetrieve = std::chrono::nanoseconds(tpRetrieve.time_since_epoch()).count() - debugTimeSend;
-
- /// All times are in nanoseconds.
- /// Frame DTS() PTS() Duration() Type() TimeCreate() TimeConvert(SUBMIT,QUERY,TOTAL) TimeEncode(SUBMIT,ENCODE,QUERY,TOTAL) TimeSendToRetrieve()
- m_AMF->GetTrace()->TraceW(
- fileName.data(), __LINE__,
- AMF_TRACE_TRACE, L"Performance Tracking", 9,
- L"Frame DTS(%8lld) PTS(%8lld) Duration(%8lld) Type(%1lld) Size(%8lld) TimeCreate(%8lld) TimeConvert(%8lld) TimeEncode(%8lld) TimeSendToRetrieve(%8lld)",
- (uint64_t)debugDTS,
- (uint64_t)debugPTS,
- (uint64_t)debugDuration,
- (uint64_t)debugPacketType,
- (uint64_t)packet->size,
- (uint64_t)debugTimeCreate,
- (uint64_t)debugTimeConvert,
- (uint64_t)debugTimeEncode,
- (uint64_t)totalTimeSendRetrieve);
- }
-
- return true;
-}
-
-bool Plugin::AMD::H264Encoder::GetExtraData(uint8_t**& extra_data, size_t*& extra_data_size) {
- if (!m_AMFContext || !m_AMFEncoder)
- throw std::exception("<" __FUNCTION_NAME__ "> Called while not initialized.");
-
- if (!m_Flag_IsStarted)
- throw std::exception("<" __FUNCTION_NAME__ "> Called while not encoding.");
-
- amf::AMFVariant var;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_EXTRADATA, &var);
- if (res == AMF_OK && var.type == amf::AMF_VARIANT_INTERFACE) {
- amf::AMFBufferPtr buf(var.pInterface);
-
- *extra_data_size = buf->GetSize();
- m_ExtraDataBuffer.resize(*extra_data_size);
- std::memcpy(m_ExtraDataBuffer.data(), buf->GetNative(), *extra_data_size);
- *extra_data = m_ExtraDataBuffer.data();
-
- return true;
- }
- return false;
-}
-
-void Plugin::AMD::H264Encoder::GetVideoInfo(struct video_scale_info*& vsi) {
- if (!m_AMFContext || !m_AMFEncoder)
- throw std::exception("<" __FUNCTION_NAME__ "> Called while not initialized.");
-
- if (!m_Flag_IsStarted)
- throw std::exception("<" __FUNCTION_NAME__ "> Called while not encoding.");
-
- switch (m_ColorFormat) {
- // 4:2:0 Formats
- case H264ColorFormat::NV12:
- vsi->format = VIDEO_FORMAT_NV12;
- break;
- case H264ColorFormat::I420:
- vsi->format = VIDEO_FORMAT_I420;
- break;
- // 4:2:2 Formats
- case H264ColorFormat::YUY2:
- vsi->format = VIDEO_FORMAT_YUY2;
- break;
- // Uncompressed
- case H264ColorFormat::RGBA:
- vsi->format = VIDEO_FORMAT_RGBA;
- break;
- case H264ColorFormat::BGRA:
- vsi->format = VIDEO_FORMAT_BGRA;
- break;
- // Other
- case H264ColorFormat::GRAY:
- vsi->format = VIDEO_FORMAT_Y800;
- break;
- }
-
- // AMF requires Partial Range for some reason.
- if (this->IsFullRangeColorEnabled()) { // Only use Full range if actually enabled.
- vsi->range = VIDEO_RANGE_FULL;
- } else {
- vsi->range = VIDEO_RANGE_PARTIAL;
- }
-}
-
-void Plugin::AMD::H264Encoder::InputThreadMain(Plugin::AMD::H264Encoder* p_this) {
- p_this->InputThreadLogic();
-}
-
-void Plugin::AMD::H264Encoder::OutputThreadMain(Plugin::AMD::H264Encoder* p_this) {
- p_this->OutputThreadLogic();
-}
-
-void Plugin::AMD::H264Encoder::InputThreadLogic() { // Thread Loop that handles Surface Submission
- // Assign Thread Name
- static const char* __threadName = "enc-amf Input Thread";
- SetThreadName(__threadName);
-
- // Core Loop
- std::unique_lock<std::mutex> lock(m_Input.mutex);
- uint32_t repeatSurfaceSubmission = 0;
- do {
- m_Input.condvar.wait(lock);
-
- // Skip to check if isStarted is false.
- if (!m_Flag_IsStarted)
- continue;
-
- // Dequeue Surface
- amf::AMFSurfacePtr surface;
- {
- std::unique_lock<std::mutex> qlock(m_Input.queuemutex);
- if (m_Input.queue.size() == 0)
- continue; // Queue is empty.
- surface = m_Input.queue.front();
- }
-
- /// Convert Frame
- AMF_RESULT res;
- amf::AMFDataPtr outbuf;
-
- /* Performance Monitoring */ std::chrono::high_resolution_clock::time_point tpConvert = std::chrono::high_resolution_clock::now();
- res = m_AMFConverter->SubmitInput(surface);
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Unable to submit Frame to Converter, error %ls (code %ld).", res);
- res = m_AMFConverter->QueryOutput(&outbuf);
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Unable to retrieve Frame from Converter, error %ls (code %ld).", res);
- /* Performance Monitoring */ outbuf->SetProperty(AMF_PROPERTY_TIME_CONVERT,
- std::chrono::nanoseconds(std::chrono::high_resolution_clock::now() - tpConvert).count());
-
- /// Submit to AMF
- /* Performance Monitoring */ outbuf->SetProperty(AMF_PROPERTY_TIME_ENCODE, std::chrono::nanoseconds(std::chrono::high_resolution_clock::now().time_since_epoch()).count());
- res = m_AMFEncoder->SubmitInput(outbuf);
- if (res == AMF_OK) {
- m_Flag_FirstFrameSubmitted = true;
-
- { // Remove Surface from Queue
- std::unique_lock<std::mutex> qlock(m_Input.queuemutex);
- m_Input.queue.pop();
- }
-
- // Reset AMF_INPUT_FULL retries.
- repeatSurfaceSubmission = 0;
-
- // Continue with next Surface.
- m_Input.condvar.notify_all();
- } else if (res == AMF_INPUT_FULL) {
- if (repeatSurfaceSubmission < 5) {
- repeatSurfaceSubmission++;
- m_Input.condvar.notify_all();
- m_Output.condvar.notify_all();
- }
- } else if (res == AMF_EOF) {
- // This should never happen, but on the off-chance that it does, just straight up leave the loop.
- break;
- } else {
- // Unknown, unexpected return code.
- std::vector<char> msgBuf(128);
- FormatTextWithAMFError(&msgBuf, "%ls (code %d)", Plugin::AMD::AMF::GetInstance()->GetTrace()->GetResultText(res), res);
- AMF_LOG_WARNING("<" __FUNCTION_NAME__ "> SubmitInput failed with error %s.", msgBuf.data());
- }
-
- std::this_thread::sleep_for(std::chrono::milliseconds(m_TimerPeriod));
- } while (m_Flag_IsStarted);
-}
-
-void Plugin::AMD::H264Encoder::OutputThreadLogic() { // Thread Loop that handles Querying
- // Assign Thread Name
- static const char* __threadName = "enc-amf Output Thread";
- SetThreadName(__threadName);
-
- // Core Loop
- std::unique_lock<std::mutex> lock(m_Output.mutex);
- do {
- m_Output.condvar.wait(lock);
-
- // Skip to check if isStarted is false.
- if (!m_Flag_IsStarted)
- continue;
-
- amf::AMFDataPtr pData = amf::AMFDataPtr();
- AMF_RESULT res = m_AMFEncoder->QueryOutput(&pData);
- if (res == AMF_OK) {
- m_Flag_FirstFrameReceived = true;
-
- uint64_t debugTimeEncode = 0;
- pData->GetProperty(AMF_PROPERTY_TIME_ENCODE, &debugTimeEncode);
- /* Performance Monitoring */ pData->SetProperty(AMF_PROPERTY_TIME_ENCODE,
- (uint64_t)(
- std::chrono::high_resolution_clock::now().time_since_epoch().count() -
- debugTimeEncode
- ));
-
- { // Queue
- std::unique_lock<std::mutex> qlock(m_Output.queuemutex);
- m_Output.queue.push(pData);
- }
-
- // Continue querying until nothing is left.
- m_Output.condvar.notify_all();
- } else if (res == AMF_REPEAT) {
- m_Input.condvar.notify_all();
- } else if (res == AMF_EOF) {
- // This should never happen, but on the off-chance that it does, just straight up leave the loop.
- break;
- } else {
- // Unknown, unexpected return code.
- std::vector<char> msgBuf(128);
- FormatTextWithAMFError(&msgBuf, "%ls (code %d)", Plugin::AMD::AMF::GetInstance()->GetTrace()->GetResultText(res), res);
- AMF_LOG_WARNING("<" __FUNCTION_NAME__ "> QueryOutput failed with error %s.", msgBuf.data());
- }
-
- std::this_thread::sleep_for(std::chrono::milliseconds(m_TimerPeriod));
- } while (m_Flag_IsStarted);
-}
-
-inline amf::AMFSurfacePtr Plugin::AMD::H264Encoder::CreateSurfaceFromFrame(struct encoder_frame*& frame) {
- AMF_RESULT res = AMF_UNEXPECTED;
- amf::AMFSurfacePtr pSurface = nullptr;
- if (m_OpenCL) {
- amf_size l_origin[] = { 0, 0, 0 };
- amf_size l_size0[] = { m_FrameSize.first, m_FrameSize.second, 1 };
- amf_size l_size1[] = { m_FrameSize.first >> 1, m_FrameSize.second >> 1, 1 };
-
- res = m_AMFContext->AllocSurface(Utility::MemoryTypeAsAMF(m_MemoryType),
- Utility::SurfaceFormatAsAMF(m_ColorFormat),
- m_FrameSize.first, m_FrameSize.second, &pSurface);
- if (res != AMF_OK) // Unable to create Surface
- ThrowExceptionWithAMFError("AllocSurface failed with error %ls (code %d).", res);
-
- amf::AMFComputeSyncPointPtr pSyncPoint;
- m_AMFCompute->PutSyncPoint(&pSyncPoint);
- pSurface->Convert(amf::AMF_MEMORY_OPENCL);
- m_AMFCompute->CopyPlaneFromHost(frame->data[0], l_origin, l_size0, frame->linesize[0], pSurface->GetPlaneAt(0), false);
- m_AMFCompute->CopyPlaneFromHost(frame->data[1], l_origin, l_size1, frame->linesize[1], pSurface->GetPlaneAt(1), false);
- m_AMFCompute->FinishQueue();
- pSurface->Convert(Utility::MemoryTypeAsAMF(m_MemoryType));
- pSyncPoint->Wait();
- } else {
- res = m_AMFContext->AllocSurface(amf::AMF_MEMORY_HOST, Utility::SurfaceFormatAsAMF(m_ColorFormat),
- m_FrameSize.first, m_FrameSize.second, &pSurface);
- if (res != AMF_OK) // Unable to create Surface
- ThrowExceptionWithAMFError("AllocSurface failed with error %ls (code %d).", res);
-
- size_t planeCount = pSurface->GetPlanesCount();
- #pragma loop(hint_parallel(2))
- for (uint8_t i = 0; i < planeCount; i++) {
- amf::AMFPlanePtr plane = pSurface->GetPlaneAt(i);
- void* plane_nat = plane->GetNative();
- int32_t height = plane->GetHeight();
- int32_t hpitch = plane->GetHPitch();
-
- for (int32_t py = 0; py < height; py++) {
- int32_t plane_off = py * hpitch;
- int32_t frame_off = py * frame->linesize[i];
- std::memcpy(
- static_cast<void*>(static_cast<uint8_t*>(plane_nat) + plane_off),
- static_cast<void*>(frame->data[i] + frame_off), frame->linesize[i]);
- }
- }
-
- // Convert to AMF native type.
- pSurface->Convert(Utility::MemoryTypeAsAMF(m_MemoryType));
- }
-
- return pSurface;
-}
-
-//////////////////////////////////////////////////////////////////////////
-// AMF Properties
-//////////////////////////////////////////////////////////////////////////
-
-void Plugin::AMD::H264Encoder::LogProperties() {
- AMF_LOG_INFO("-- AMD Advanced Media Framework Encoder --");
-
- // Initialization Properties
- AMF_LOG_INFO("Initialization Properties: ");
- AMF_LOG_INFO(" Type: %s", Utility::VCEEncoderTypeAsString(m_EncoderType));
- AMF_LOG_INFO(" Video API: %s", Utility::MemoryTypeAsString(m_MemoryType));
- if (m_MemoryType != H264MemoryType::Host) {
- AMF_LOG_INFO(" Video Adapter: %s", m_APIAdapter.Name.c_str());
- AMF_LOG_INFO(" OpenCL: %s", m_OpenCL ? "Enabled" : "Disabled");
- }
- AMF_LOG_INFO(" Color Format: %s", Utility::SurfaceFormatAsString(m_ColorFormat));
-
- // Startup Properties
- AMF_LOG_INFO("Startup Properties: ");
- AMF_LOG_INFO(" Usage: %s", Utility::UsageAsString(GetUsage()));
- AMF_LOG_INFO(" Quality Preset: %s", Utility::QualityPresetAsString(GetQualityPreset()));
- uint8_t profileLevel = (uint8_t)this->GetProfileLevel();
- AMF_LOG_INFO(" Profile: %s %d.%d", Utility::ProfileAsString(GetProfile()), profileLevel / 10, profileLevel % 10);
-
- // Frame Properties
- AMF_LOG_INFO("Frame Properties: ");
- try {
- AMF_LOG_INFO(" Color Profile: %s", GetColorProfile() == H264ColorProfile::Rec709 ? "709" : "601");
- } catch (...) {
- AMF_LOG_INFO(" Color Profile: N/A");
- }
- try {
- AMF_LOG_INFO(" Color Range: %s", IsFullRangeColorEnabled() ? "Full" : "Partial");
- } catch (...) {
- AMF_LOG_INFO(" Color Range: N/A");
- }
- AMF_LOG_INFO(" Resolution: %dx%d", GetResolution().first, GetResolution().second);
- AMF_LOG_INFO(" Frame Rate: %d/%d", GetFrameRate().first, GetFrameRate().second);
- AMF_LOG_INFO(" Scan Type: %s", GetScanType() == H264ScanType::Progressive ? "Progressive" : "Interlaced");
-
- // Rate Control Properties
- AMF_LOG_INFO("Rate Control Properties: ");
- if (GetUsage() != H264Usage::UltraLowLatency) {
- AMF_LOG_INFO(" Method: %s", Utility::RateControlMethodAsString(GetRateControlMethod()));
- } else {
- AMF_LOG_INFO(" Method: Ultra Low Latency");
- }
- AMF_LOG_INFO(" Bitrate: ");
- AMF_LOG_INFO(" Target: %d bits", GetTargetBitrate());
- if (GetUsage() != H264Usage::UltraLowLatency) {
- AMF_LOG_INFO(" Peak: %d bits", GetPeakBitrate());
- } else {
- AMF_LOG_INFO(" Peak: Ultra Low Latency");
- }
- AMF_LOG_INFO(" Quantization Parameter: ");
- AMF_LOG_INFO(" Minimum: %d", GetMinimumQP());
- AMF_LOG_INFO(" Maximum: %d", GetMaximumQP());
- AMF_LOG_INFO(" I-Frame: %d", GetIFrameQP());
- AMF_LOG_INFO(" P-Frame: %d", GetPFrameQP());
- if (VCECapabilities::GetInstance()->GetAdapterCapabilities(m_API, m_APIAdapter, H264EncoderType::AVC).supportsBFrames) {
- try { AMF_LOG_INFO(" B-Frame: %d", GetBFrameQP()); } catch (...) {}
- } else {
- AMF_LOG_INFO(" B-Frame: N/A");
- }
- AMF_LOG_INFO(" VBV Buffer: ");
- AMF_LOG_INFO(" Size: %d bits", GetVBVBufferSize());
- if (GetUsage() != H264Usage::UltraLowLatency) {
- AMF_LOG_INFO(" Initial Fullness: %f%%", GetInitialVBVBufferFullness() * 100.0);
- } else {
- AMF_LOG_INFO(" Initial Fullness: Ultra Low Latency");
- }
- AMF_LOG_INFO(" Flags: ");
- if (GetUsage() != H264Usage::UltraLowLatency) {
- AMF_LOG_INFO(" Filler Data: %s", IsFillerDataEnabled() ? "Enabled" : "Disabled");
- AMF_LOG_INFO(" Frame Skipping: %s", IsFrameSkippingEnabled() ? "Enabled" : "Disabled");
- AMF_LOG_INFO(" Enforce HRD Restrictions: %s", IsEnforceHRDRestrictionsEnabled() ? "Enabled" : "Disabled");
- } else {
- AMF_LOG_INFO(" Filler Data: Ultra Low Latency");
- AMF_LOG_INFO(" Frame Skipping: Ultra Low Latency");
- AMF_LOG_INFO(" Enforce HRD Restrictions: Ultra Low Latency");
- }
-
- // Picture Control Properties
- AMF_LOG_INFO("Picture Control Properties: ");
- AMF_LOG_INFO(" IDR Period: %d frames", GetIDRPeriod());
- if (VCECapabilities::GetInstance()->GetAdapterCapabilities(m_API, m_APIAdapter, H264EncoderType::AVC).supportsBFrames) {
- AMF_LOG_INFO(" B-Frame Pattern: %d", GetBFramePattern());
- try {
- AMF_LOG_INFO(" B-Frame Delta QP: %d", GetBFrameDeltaQP());
- } catch (...) {
- AMF_LOG_INFO(" B-Frame Delta QP: N/A");
- }
- if (GetUsage() == H264Usage::Transcoding) {
- AMF_LOG_INFO(" B-Frame Reference: %s", IsBFrameReferenceEnabled() ? "Enabled" : "Disabled");
- try {
- AMF_LOG_INFO(" B-Frame Reference Delta QP: %d", GetBFrameReferenceDeltaQP());
- } catch (...) {
- AMF_LOG_INFO(" B-Frame Reference Delta QP: N/A");
- }
- } else {
- AMF_LOG_INFO(" B-Frame Reference: Low Latency Mode");
- AMF_LOG_INFO(" B-Frame Reference Delta QP: Low Latency Mode");
- }
- } else {
- AMF_LOG_INFO(" B-Frame Pattern: N/A");
- AMF_LOG_INFO(" B-Frame Delta QP: N/A");
- AMF_LOG_INFO(" B-Frame Reference: N/A");
- AMF_LOG_INFO(" B-Frame Reference Delta QP: N/A");
- }
-
- AMF_LOG_INFO("Miscellaneous Properties: ");
- if (GetUsage() == H264Usage::Transcoding) {
- AMF_LOG_INFO(" Deblocking Filter: %s", IsDeblockingFilterEnabled() ? "Enabled" : "Disabled");
- } else {
- AMF_LOG_INFO(" Deblocking Filter: Low Latency Mode");
- }
- AMF_LOG_INFO(" Motion Estimation: %s",
- (this->IsHalfPixelMotionEstimationEnabled()
- ? (this->IsQuarterPixelMotionEstimationEnabled()
- ? "Half & Quarter Pixel"
- : "Half Pixel")
- : (this->IsQuarterPixelMotionEstimationEnabled()
- ? "Quarter Pixel"
- : "None")
- )
- );
-
- AMF_LOG_INFO("Experimental Properties: ");
- try { AMF_LOG_INFO(" Maximum MB/s: %d", GetMaxMBPerSec()); } catch (...) {}
- try { AMF_LOG_INFO(" Coding Type: %s", Utility::CodingTypeAsString(GetCodingType())); } catch (...) {}
- try { AMF_LOG_INFO(" Wait For Task: %s", IsWaitForTaskEnabled() ? "Enabled" : "Disabled"); } catch (...) {}
- try { AMF_LOG_INFO("  Pre-Analysis Pass: %s", IsPreAnalysisPassEnabled() ? "Enabled" : "Disabled"); } catch (...) {}
- try { AMF_LOG_INFO(" VBAQ: %s", IsVBAQEnabled() ? "Enabled" : "Disabled"); } catch (...) {}
- try { AMF_LOG_INFO(" Header Insertion Spacing: %d frames", GetHeaderInsertionSpacing()); } catch (...) {}
- try { AMF_LOG_INFO(" Maximum Long-Term Reference Frames: %d", GetMaximumLongTermReferenceFrames()); } catch (...) {}
- try { AMF_LOG_INFO(" Maximum Access Unit Size: %d bits", GetMaximumAccessUnitSize()); } catch (...) {}
- try { AMF_LOG_INFO(" Maximum Reference Frames: %d", GetMaximumReferenceFrames()); } catch (...) {}
- try { AMF_LOG_INFO(" Aspect Ratio: %d:%d", GetAspectRatio().first, this->GetAspectRatio().second); } catch (...) {}
- try { AMF_LOG_INFO("  GOP Size: %d", GetGOPSize()); } catch (...) {}
- try { AMF_LOG_INFO("  GOP Alignment: %s", IsGOPAlignementEnabled() ? "Enabled" : "Disabled"); } catch (...) {}
- try { AMF_LOG_INFO("  Intra-Refresh Macroblocks Per Slot: %d", GetIntraRefreshMacroblocksPerSlot()); } catch (...) {}
- try { AMF_LOG_INFO("  Intra-Refresh Number Of Stripes: %d", GetIntraRefreshNumberOfStripes()); } catch (...) {}
- try { AMF_LOG_INFO(" Slices Per Frame: %d", GetSlicesPerFrame()); } catch (...) {}
- try { AMF_LOG_INFO(" Slice Mode: %s", Utility::SliceModeAsString(GetSliceMode())); } catch (...) {}
- try { AMF_LOG_INFO(" Maximum Slice Size: %d", GetMaximumSliceSize()); } catch (...) {}
- try { AMF_LOG_INFO(" Slice Control Mode: %s", Utility::SliceControlModeAsString(GetSliceControlMode())); } catch (...) {}
- try { AMF_LOG_INFO(" Slice Control Size: %d", GetSliceControlSize()); } catch (...) {}
-
- Plugin::AMD::VCECapabilities::ReportAdapterCapabilities(m_API, m_APIAdapter);
-
- #ifdef _DEBUG
- printDebugInfo(m_AMFEncoder);
- #endif
-
- AMF_LOG_INFO("-- AMD Advanced Media Framework Encoder --");
-}
-
-void Plugin::AMD::H264Encoder::SetUsage(H264Usage usage) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_USAGE,
- (uint32_t)Utility::UsageAsAMF(usage));
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).",
- res, Utility::UsageAsString(usage));
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", Utility::UsageAsString(usage));
-}
-
-Plugin::AMD::H264Usage Plugin::AMD::H264Encoder::GetUsage() {
- uint32_t usage;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_USAGE, &usage);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.",
- Utility::UsageAsString(Utility::UsageFromAMF(usage)));
- return Utility::UsageFromAMF(usage);
-}
-
-void Plugin::AMD::H264Encoder::SetQualityPreset(H264QualityPreset preset) {
- static AMF_VIDEO_ENCODER_QUALITY_PRESET_ENUM CustomToAMF[] = {
- AMF_VIDEO_ENCODER_QUALITY_PRESET_SPEED,
- AMF_VIDEO_ENCODER_QUALITY_PRESET_BALANCED,
- AMF_VIDEO_ENCODER_QUALITY_PRESET_QUALITY,
- };
- static const char* CustomToName[] = {
- "Speed",
- "Balanced",
- "Quality",
- };
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_QUALITY_PRESET, (uint32_t)CustomToAMF[(uint8_t)preset]);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, CustomToName[(uint8_t)preset]);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", CustomToName[(uint8_t)preset]);
-}
-
-Plugin::AMD::H264QualityPreset Plugin::AMD::H264Encoder::GetQualityPreset() {
- static H264QualityPreset AMFToCustom[] = {
- H264QualityPreset::Balanced,
- H264QualityPreset::Speed,
- H264QualityPreset::Quality,
- };
- static const char* CustomToName[] = {
- "Speed",
- "Balanced",
- "Quality",
- };
-
- uint32_t preset;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_QUALITY_PRESET, &preset);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", CustomToName[(uint8_t)AMFToCustom[(uint8_t)preset]]);
- return AMFToCustom[(uint8_t)preset];
-}
-
-void Plugin::AMD::H264Encoder::SetProfile(H264Profile profile) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_PROFILE, (uint32_t)profile);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, Utility::ProfileAsString(profile));
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", Utility::ProfileAsString(profile));
-}
-
-Plugin::AMD::H264Profile Plugin::AMD::H264Encoder::GetProfile() {
- uint32_t profile;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_PROFILE, &profile);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", Utility::ProfileAsString((H264Profile)profile));
- return (H264Profile)profile;
-}
-
-void Plugin::AMD::H264Encoder::SetProfileLevel(H264ProfileLevel level) {
- // Automatic Detection
- if (level == H264ProfileLevel::Automatic) {
- auto frameSize = this->GetResolution();
- auto frameRate = this->GetFrameRate();
- level = Plugin::Utility::GetMinimumProfileLevel(frameSize, frameRate);
- }
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_PROFILE_LEVEL, (uint32_t)level);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, level);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", level);
-}
-
-Plugin::AMD::H264ProfileLevel Plugin::AMD::H264Encoder::GetProfileLevel() {
- uint32_t profileLevel;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_PROFILE_LEVEL, &profileLevel);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", profileLevel);
- return (H264ProfileLevel)(profileLevel);
-}
-
-void Plugin::AMD::H264Encoder::SetColorProfile(H264ColorProfile profile) {
- AMF_VIDEO_CONVERTER_COLOR_PROFILE_ENUM pluginToAMF[] = {
- AMF_VIDEO_CONVERTER_COLOR_PROFILE_601,
- AMF_VIDEO_CONVERTER_COLOR_PROFILE_709,
- AMF_VIDEO_CONVERTER_COLOR_PROFILE_2020,
- };
- const char* pluginToString[] = {
- "601",
- "709",
- "2020",
- };
-
- AMF_RESULT res = m_AMFConverter->SetProperty(AMF_VIDEO_CONVERTER_COLOR_PROFILE,
- pluginToAMF[(uint8_t)profile]);
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Unable to set Color Profile, error %ls (code %ld).", res);
- m_ColorProfile = profile;
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", pluginToString[(uint8_t)profile]);
-}
-
-Plugin::AMD::H264ColorProfile Plugin::AMD::H264Encoder::GetColorProfile() {
- return m_ColorProfile;
-}
-
-void Plugin::AMD::H264Encoder::SetFullRangeColorEnabled(bool enabled) {
- // Info from Mikhail:
- // - Name may change in the future
- // - Use GetProperty or GetPropertyDescription to test for older or newer drivers.
- const wchar_t* names[] = {
- L"FullRangeColor", // 16.12.1
- L"NominalRange", // 16.11.5 and below.
- };
-
- bool enabledTest;
- AMF_RESULT res = AMF_INVALID_ARG;
- for (size_t i = 0; i < _countof(names); i++) {
- if (m_AMFEncoder->GetProperty(names[i], &enabledTest) == AMF_OK) {
- m_AMFConverter->SetProperty(names[i], enabled);
- res = m_AMFEncoder->SetProperty(names[i], enabled);
- break;
- }
- }
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsFullRangeColorEnabled() {
- // Info from Mikhail:
- // - Name may change in the future
- // - Use GetProperty or GetPropertyDescription to test for older or newer drivers.
- const wchar_t* names[] = {
- L"FullRangeColor", // 16.12.1
- L"NominalRange", // 16.11.5 and below.
- };
-
- bool enabled;
- AMF_RESULT res = AMF_INVALID_ARG;
- for (size_t i = 0; i < _countof(names); i++) {
- res = m_AMFEncoder->GetProperty(names[i], &enabled);
- if (res == AMF_OK) {
- break;
- }
- }
- if (res != AMF_OK)
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetResolution(uint32_t width, uint32_t height) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_FRAMESIZE, ::AMFConstructSize(width, height));
- if (res != AMF_OK) {
- std::vector<char> msgBuf(128);
- sprintf(msgBuf.data(), "%dx%d", width, height);
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, msgBuf.data());
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %dx%d.", width, height);
- m_FrameSize.first = width;
- m_FrameSize.second = height;
-
- if (this->GetProfileLevel() == H264ProfileLevel::Automatic)
- this->SetProfileLevel(H264ProfileLevel::Automatic);
-}
-
-std::pair<uint32_t, uint32_t> Plugin::AMD::H264Encoder::GetResolution() {
- AMFSize frameSize;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_FRAMESIZE, &frameSize);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %dx%d.", frameSize.width, frameSize.height);
- m_FrameSize.first = frameSize.width;
- m_FrameSize.second = frameSize.height;
-
- if (this->GetProfileLevel() == H264ProfileLevel::Automatic)
- this->SetProfileLevel(H264ProfileLevel::Automatic);
-
- return std::pair<uint32_t, uint32_t>(m_FrameSize);
-}
-
-void Plugin::AMD::H264Encoder::SetFrameRate(uint32_t num, uint32_t den) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_FRAMERATE, ::AMFConstructRate(num, den));
- if (res != AMF_OK) {
- std::vector<char> msgBuf(128);
- sprintf(msgBuf.data(), "%d/%d", num, den);
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, msgBuf.data());
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d/%d.", num, den);
- m_FrameRate.first = num;
- m_FrameRate.second = den;
- m_FrameRateDivisor = (double_t)m_FrameRate.first / (double_t)m_FrameRate.second;
- m_FrameRateReverseDivisor = ((double_t)m_FrameRate.second / (double_t)m_FrameRate.first);
- m_InputQueueLimit = (uint32_t)ceil(m_FrameRateDivisor * 3);
-
- if (this->GetProfileLevel() == H264ProfileLevel::Automatic)
- this->SetProfileLevel(H264ProfileLevel::Automatic);
-}
-
-std::pair<uint32_t, uint32_t> Plugin::AMD::H264Encoder::GetFrameRate() {
- AMFRate frameRate;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_FRAMERATE, &frameRate);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d/%d.", frameRate.num, frameRate.den);
- m_FrameRate.first = frameRate.num;
- m_FrameRate.second = frameRate.den;
- m_FrameRateDivisor = (double_t)frameRate.num / (double_t)frameRate.den;
- m_InputQueueLimit = (uint32_t)ceil(m_FrameRateDivisor * 3);
-
- if (this->GetProfileLevel() == H264ProfileLevel::Automatic)
- this->SetProfileLevel(H264ProfileLevel::Automatic);
-
- return std::pair<uint32_t, uint32_t>(m_FrameRate);
-}
-
-void Plugin::AMD::H264Encoder::SetScanType(H264ScanType scanType) {
- static AMF_VIDEO_ENCODER_SCANTYPE_ENUM CustomToAMF[] = {
- AMF_VIDEO_ENCODER_SCANTYPE_PROGRESSIVE,
- AMF_VIDEO_ENCODER_SCANTYPE_INTERLACED,
- };
- static const char* CustomToName[] = {
- "Progressive",
- "Interlaced",
- };
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_SCANTYPE, (uint32_t)CustomToAMF[(uint8_t)scanType]);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, CustomToName[(uint8_t)scanType]);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", CustomToName[(uint8_t)scanType]);
-}
-
-Plugin::AMD::H264ScanType Plugin::AMD::H264Encoder::GetScanType() {
- static const char* CustomToName[] = {
- "Progressive",
- "Interlaced",
- };
-
- uint32_t scanType;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_SCANTYPE, &scanType);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", CustomToName[scanType]);
- return (Plugin::AMD::H264ScanType)scanType;
-}
-
-void Plugin::AMD::H264Encoder::SetRateControlMethod(H264RateControlMethod method) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD,
- (uint64_t)Utility::RateControlMethodAsAMF(method));
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).",
- res, Utility::RateControlMethodAsString(method));
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.",
- Utility::RateControlMethodAsString(method));
-}
-
-Plugin::AMD::H264RateControlMethod Plugin::AMD::H264Encoder::GetRateControlMethod() {
- uint32_t method;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD, &method);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.",
- Utility::RateControlMethodAsString(Utility::RateControlMethodFromAMF(method)));
- return Utility::RateControlMethodFromAMF(method);
-}
-
-void Plugin::AMD::H264Encoder::SetTargetBitrate(uint32_t bitrate) {
- // Clamp Value
- bitrate = clamp(bitrate, 10000,
- Plugin::AMD::VCECapabilities::GetInstance()->GetAdapterCapabilities(m_API, m_APIAdapter, H264EncoderType::AVC).maxBitrate);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_TARGET_BITRATE, bitrate);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d bits failed with error %ls (code %d).", res, bitrate);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d bits.", bitrate);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetTargetBitrate() {
- uint32_t bitrate;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_TARGET_BITRATE, &bitrate);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d bits.", bitrate);
- return bitrate;
-}
-
-void Plugin::AMD::H264Encoder::SetPeakBitrate(uint32_t bitrate) {
- // Clamp Value
- bitrate = clamp(bitrate, 10000,
- Plugin::AMD::VCECapabilities::GetInstance()->GetAdapterCapabilities(m_API, m_APIAdapter, H264EncoderType::AVC).maxBitrate);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_PEAK_BITRATE, (uint32_t)bitrate);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d bits failed with error %ls (code %d).", res, bitrate);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d bits.", bitrate);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetPeakBitrate() {
- uint32_t bitrate;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_PEAK_BITRATE, &bitrate);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d bits.", bitrate);
- return bitrate;
-}
-
-void Plugin::AMD::H264Encoder::SetMinimumQP(uint8_t qp) {
- // Clamp Value
- qp = clamp(qp, 0, 51);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MIN_QP, (uint32_t)qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, qp);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", qp);
-}
-
-uint8_t Plugin::AMD::H264Encoder::GetMinimumQP() {
- uint32_t qp;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MIN_QP, &qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", qp);
- return (uint8_t)qp;
-}
-
-void Plugin::AMD::H264Encoder::SetMaximumQP(uint8_t qp) {
- // Clamp Value
- qp = clamp(qp, 0, 51);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MAX_QP, (uint32_t)qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, qp);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", qp);
-}
-
-uint8_t Plugin::AMD::H264Encoder::GetMaximumQP() {
- uint32_t qp;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MAX_QP, &qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", qp);
- return (uint8_t)qp;
-}
-
-void Plugin::AMD::H264Encoder::SetIFrameQP(uint8_t qp) {
- // Clamp Value
- qp = clamp(qp, 0, 51);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_QP_I, (uint32_t)qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, qp);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", qp);
-}
-
-uint8_t Plugin::AMD::H264Encoder::GetIFrameQP() {
- uint32_t qp;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_QP_I, &qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", qp);
- return (uint8_t)qp;
-}
-
-void Plugin::AMD::H264Encoder::SetPFrameQP(uint8_t qp) {
- // Clamp Value
- qp = clamp(qp, 0, 51);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_QP_P, (uint32_t)qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, qp);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", qp);
-}
-
-uint8_t Plugin::AMD::H264Encoder::GetPFrameQP() {
- uint32_t qp;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_QP_P, &qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", qp);
- return (uint8_t)qp;
-}
-
-void Plugin::AMD::H264Encoder::SetBFrameQP(uint8_t qp) {
- // Clamp Value
- qp = clamp(qp, 0, 51);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_QP_B, (uint32_t)qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, qp);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", qp);
-}
-
-uint8_t Plugin::AMD::H264Encoder::GetBFrameQP() {
- uint32_t qp;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_QP_B, &qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", qp);
- return (uint8_t)qp;
-}
-
-void Plugin::AMD::H264Encoder::SetVBVBufferSize(uint32_t size) {
- // Clamp Value
- size = clamp(size, 1000, 100000000); // 1kbit to 100mbit.
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_VBV_BUFFER_SIZE, (uint32_t)size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d bits failed with error %ls (code %d).", res, size);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d bits.", size);
-}
-
-void Plugin::AMD::H264Encoder::SetVBVBufferAutomatic(double_t strictness) {
- uint32_t strictBitrate = 1000, looseBitrate = 100000000;
-
- // Strict VBV Buffer Size = Bitrate / FPS
- // Loose VBV Buffer Size = Bitrate
-
- if (GetUsage() == H264Usage::UltraLowLatency) {
- looseBitrate = GetTargetBitrate();
- } else {
- switch (this->GetRateControlMethod()) {
- case H264RateControlMethod::ConstantBitrate:
- case H264RateControlMethod::VariableBitrate_LatencyConstrained:
- looseBitrate = this->GetTargetBitrate();
- break;
- case H264RateControlMethod::VariableBitrate_PeakConstrained:
- looseBitrate = max(this->GetTargetBitrate(), this->GetPeakBitrate());
- break;
- case H264RateControlMethod::ConstantQP:
- // When using Constant QP, one will have to pick a QP that is decent
- // in both quality and bitrate. We can easily calculate both the QP
- // required for an average bitrate and the average bitrate itself
- // with these formulas:
- // BITRATE = ((1 - (QP / 51)) ^ 2) * ((Width * Height) * 1.5 * (FPSNumerator / FPSDenumerator))
- // QP = (1 - sqrt(BITRATE / ((Width * Height) * 1.5 * (FPSNumerator / FPSDenumerator)))) * 51
-
- auto frameSize = this->GetResolution();
- auto frameRate = this->GetFrameRate();
-
- double_t bitrate = frameSize.first * frameSize.second;
- switch (this->m_ColorFormat) {
- case H264ColorFormat::NV12:
- case H264ColorFormat::I420:
- bitrate *= 1.5;
- break;
- case H264ColorFormat::YUY2:
- bitrate *= 4;
- break;
- case H264ColorFormat::BGRA:
- case H264ColorFormat::RGBA:
- bitrate *= 3;
- break;
- case H264ColorFormat::GRAY:
- bitrate *= 1;
- break;
- }
-			bitrate *= (double_t)frameRate.first / (double_t)frameRate.second; // Avoid integer division.
-
- uint8_t qp_i, qp_p, qp_b;
- qp_i = this->GetIFrameQP();
- qp_p = this->GetPFrameQP();
- try { qp_b = this->GetBFrameQP(); } catch (...) { qp_b = 51; }
- double_t qp = 1 - ((double_t)(min(min(qp_i, qp_p), qp_b)) / 51.0);
- qp = max(qp * qp, 0.001); // Needs to be at least 0.001.
-
- looseBitrate = static_cast<uint32_t>(bitrate * qp);
- break;
- }
- }
- strictBitrate = static_cast<uint32_t>(looseBitrate * m_FrameRateReverseDivisor);
-
- // 0% = 100000, 50% = looseBitrate, 100% = strictBitrate
-	strictness = min(max(strictness, 0.0), 1.0);
-	double_t aAB = min(strictness * 2.0, 1.0);
-	double_t bAB = max(strictness * 2.0 - 1.0, 0.0);
-
-	double_t aFade = (looseBitrate * aAB) + (100000 * (1.0 - aAB));
-	double_t bFade = (strictBitrate * bAB) + (aFade * (1.0 - bAB));
-
- uint32_t vbvBufferSize = static_cast<uint32_t>(round(bFade));
- this->SetVBVBufferSize(vbvBufferSize);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetVBVBufferSize() {
- uint32_t size;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_VBV_BUFFER_SIZE, &size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", size);
- return size;
-}
-
-void Plugin::AMD::H264Encoder::SetInitialVBVBufferFullness(double_t fullness) {
- // Clamp Value
-	fullness = max(min(fullness, 1.0), 0.0); // Clamp to 0.0 to 1.0 (0 to 100%).
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_INITIAL_VBV_BUFFER_FULLNESS, (uint32_t)(fullness * 64));
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %f%% failed with error %ls (code %d).", res, fullness * 100);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %f%%.", fullness * 100);
-}
-
-double_t Plugin::AMD::H264Encoder::GetInitialVBVBufferFullness() {
- uint32_t vbvBufferFullness;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_INITIAL_VBV_BUFFER_FULLNESS, &vbvBufferFullness);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %f%%.", vbvBufferFullness / 64.0 * 100.0);
- return ((double_t)vbvBufferFullness / 64.0);
-}
-
-void Plugin::AMD::H264Encoder::SetFillerDataEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_FILLER_DATA_ENABLE, enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsFillerDataEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_FILLER_DATA_ENABLE, &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetFrameSkippingEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_SKIP_FRAME_ENABLE, enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsFrameSkippingEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_SKIP_FRAME_ENABLE, &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetEnforceHRDRestrictionsEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_ENFORCE_HRD, enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsEnforceHRDRestrictionsEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_ENFORCE_HRD, &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetIDRPeriod(uint32_t period) {
- // Clamp Value
- period = max(min(period, 1000), 1); // 1-1000 so that OBS can actually quit.
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_IDR_PERIOD, (uint32_t)period);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, period);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", period);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetIDRPeriod() {
- int32_t period;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_IDR_PERIOD, &period);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", period);
- return period;
-}
-
-void Plugin::AMD::H264Encoder::SetBFramePattern(H264BFramePattern pattern) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_B_PIC_PATTERN, (uint32_t)pattern);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, pattern);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", pattern);
-}
-
-Plugin::AMD::H264BFramePattern Plugin::AMD::H264Encoder::GetBFramePattern() {
- uint32_t pattern;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_B_PIC_PATTERN, &pattern);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", pattern);
- return (Plugin::AMD::H264BFramePattern)pattern;
-}
-
-void Plugin::AMD::H264Encoder::SetBFrameDeltaQP(int8_t qp) {
- // Clamp Value
- qp = clamp(qp, -10, 10);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_B_PIC_DELTA_QP, (int32_t)qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, qp);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", qp);
-}
-
-int8_t Plugin::AMD::H264Encoder::GetBFrameDeltaQP() {
- int32_t qp;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_B_PIC_DELTA_QP, &qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", qp);
- return (int8_t)qp;
-}
-
-void Plugin::AMD::H264Encoder::SetBFrameReferenceEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_B_REFERENCE_ENABLE, enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsBFrameReferenceEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_B_REFERENCE_ENABLE, &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetBFrameReferenceDeltaQP(int8_t qp) {
- // Clamp Value
- qp = clamp(qp, -10, 10);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_REF_B_PIC_DELTA_QP, (int32_t)qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, qp);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", qp);
-}
-
-int8_t Plugin::AMD::H264Encoder::GetBFrameReferenceDeltaQP() {
- int32_t qp;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_REF_B_PIC_DELTA_QP, &qp);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", qp);
- return (int8_t)qp;
-}
-
-void Plugin::AMD::H264Encoder::SetDeblockingFilterEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_DE_BLOCKING_FILTER, enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsDeblockingFilterEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_DE_BLOCKING_FILTER, &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetHalfPixelMotionEstimationEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MOTION_HALF_PIXEL, enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsHalfPixelMotionEstimationEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MOTION_HALF_PIXEL, &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetQuarterPixelMotionEstimationEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MOTION_QUARTERPIXEL, enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsQuarterPixelMotionEstimationEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MOTION_QUARTERPIXEL, &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-//////////////////////////////////////////////////////////////////////////
-// EXPERIMENTAL PROPERTIES - MAY BREAK AT ANY POINT IN TIME!
-//////////////////////////////////////////////////////////////////////////
-// Their effect may vary from driver to driver, card to card.
-
-uint32_t Plugin::AMD::H264Encoder::GetMaxMBPerSec() {
- uint32_t maxMBPerSec;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"MaxMBPerSec", &maxMBPerSec);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", maxMBPerSec);
- return maxMBPerSec;
-}
-
-void Plugin::AMD::H264Encoder::SetHeaderInsertionSpacing(uint32_t spacing) {
- // Clamp Value
- spacing = max(min(spacing, m_FrameRate.second * 1000), 0);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEADER_INSERTION_SPACING, (uint32_t)spacing);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, spacing);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", spacing);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetHeaderInsertionSpacing() {
- int32_t headerInsertionSpacing;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEADER_INSERTION_SPACING, &headerInsertionSpacing);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", headerInsertionSpacing);
- return headerInsertionSpacing;
-}
-
-void Plugin::AMD::H264Encoder::SetMaximumLongTermReferenceFrames(uint32_t maximumLTRFrames) {
- // Clamp Parameter Value
- maximumLTRFrames = max(min(maximumLTRFrames, 2), 0);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MAX_LTR_FRAMES, (uint32_t)maximumLTRFrames);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, maximumLTRFrames);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", maximumLTRFrames);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetMaximumLongTermReferenceFrames() {
- uint32_t maximumLTRFrames;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MAX_LTR_FRAMES, &maximumLTRFrames);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", maximumLTRFrames);
- return maximumLTRFrames;
-}
-
-void Plugin::AMD::H264Encoder::SetCodingType(H264CodingType type) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"CABACEnable", (size_t)type);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, Utility::CodingTypeAsString(type));
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", Utility::CodingTypeAsString(type));
-}
-
-H264CodingType Plugin::AMD::H264Encoder::GetCodingType() {
- uint64_t type;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"CABACEnable", &type);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", Utility::CodingTypeAsString((H264CodingType)type));
- return (H264CodingType)type;
-}
-
-void Plugin::AMD::H264Encoder::SetMaximumAccessUnitSize(uint32_t size) {
- // Clamp Value
-	size = max(min(size, 100000000), 0); // 0 to 100 mbit.
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MAX_AU_SIZE, (uint32_t)size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d bits failed with error %ls (code %d).", res, size);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d bits.", size);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetMaximumAccessUnitSize() {
- uint32_t size;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MAX_AU_SIZE, &size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", size);
- return size;
-}
-
-void Plugin::AMD::H264Encoder::SetWaitForTaskEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"WaitForTask", enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsWaitForTaskEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"WaitForTask", &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetPreAnalysisPassEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"RateControlPreanalysisEnable", enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsPreAnalysisPassEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"RateControlPreanalysisEnable", &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetVBAQEnabled(bool enabled) {
- const wchar_t* names[] = {
- L"EnableVBAQ", // 16.12.1
- L"EanbleVBAQ", // 16.11.5 and below.
- };
-
- bool enabledTest;
- AMF_RESULT res = AMF_INVALID_ARG;
- for (size_t i = 0; i < _countof(names); i++) {
- if (m_AMFEncoder->GetProperty(names[i], &enabledTest) == AMF_OK) {
- m_AMFConverter->SetProperty(names[i], enabled);
- res = m_AMFEncoder->SetProperty(names[i], enabled);
- break;
- }
- }
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsVBAQEnabled() {
- const wchar_t* names[] = {
- L"EnableVBAQ", // 16.12.1
- L"EanbleVBAQ", // 16.11.5 and below.
- };
-
- bool enabled;
- AMF_RESULT res = AMF_INVALID_ARG;
- for (size_t i = 0; i < _countof(names); i++) {
- res = m_AMFEncoder->GetProperty(names[i], &enabled);
- if (res == AMF_OK) {
- break;
- }
- }
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetGOPSize(uint32_t size) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"GOPSize", (uint32_t)size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, size);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", size);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetGOPSize() {
- uint32_t size;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"GOPSize", &size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", size);
- return size;
-}
-
-void Plugin::AMD::H264Encoder::SetGOPAlignmentEnabled(bool enabled) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"EnableGOPAlignment", enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, enabled ? "Enabled" : "Disabled");
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", enabled ? "Enabled" : "Disabled");
-}
-
-bool Plugin::AMD::H264Encoder::IsGOPAlignementEnabled() {
- bool enabled;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"EnableGOPAlignment", &enabled);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", enabled ? "Enabled" : "Disabled");
- return enabled;
-}
-
-void Plugin::AMD::H264Encoder::SetMaximumReferenceFrames(uint32_t numFrames) {
- auto caps = VCECapabilities::GetInstance()->GetAdapterCapabilities(m_API, m_APIAdapter, H264EncoderType::AVC);
- numFrames = clamp(numFrames,
- caps.minReferenceFrames,
- caps.maxReferenceFrames);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"MaxNumRefFrames", (uint32_t)numFrames);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, numFrames);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", numFrames);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetMaximumReferenceFrames() {
- uint32_t numFrames;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"MaxNumRefFrames", &numFrames);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", numFrames);
- return numFrames;
-}
-
-void Plugin::AMD::H264Encoder::SetAspectRatio(uint32_t num, uint32_t den) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"AspectRatio", ::AMFConstructRate(num, den));
- if (res != AMF_OK) {
-		std::vector<char> msgBuf(32);
-		snprintf(msgBuf.data(), msgBuf.size(), "%d:%d", num, den);
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, msgBuf.data());
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d:%d.", num, den);
-}
-
-std::pair<uint32_t, uint32_t> Plugin::AMD::H264Encoder::GetAspectRatio() {
- AMFRate aspectRatio;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"AspectRatio", &aspectRatio);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d:%d.", aspectRatio.num, aspectRatio.den);
- return std::pair<uint32_t, uint32_t>(aspectRatio.num, aspectRatio.den);
-}
-
-void Plugin::AMD::H264Encoder::SetIntraRefreshMacroblocksPerSlot(uint32_t mbs) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_INTRA_REFRESH_NUM_MBS_PER_SLOT, (uint32_t)mbs);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, mbs);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", mbs);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetIntraRefreshMacroblocksPerSlot() {
- int32_t mbs;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_INTRA_REFRESH_NUM_MBS_PER_SLOT, &mbs);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", mbs);
- return mbs;
-}
-
-void Plugin::AMD::H264Encoder::SetIntraRefreshNumberOfStripes(uint32_t stripes) {
- stripes = clamp(stripes, 0, INT_MAX);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"IntraRefreshNumOfStripes", (uint32_t)stripes);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, stripes);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", stripes);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetIntraRefreshNumberOfStripes() {
- uint32_t stripes;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"IntraRefreshNumOfStripes", &stripes);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", stripes);
- return stripes;
-}
-
-void Plugin::AMD::H264Encoder::SetSlicesPerFrame(uint32_t slices) {
- slices = max(slices, 1);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_SLICES_PER_FRAME, (uint32_t)slices);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, slices);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", slices);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetSlicesPerFrame() {
- uint32_t slices;
- AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_SLICES_PER_FRAME, &slices);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", slices);
- return slices;
-}
-
-void Plugin::AMD::H264Encoder::SetSliceMode(H264SliceMode mode) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"SliceMode", (uint32_t)mode);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, Utility::SliceModeAsString(mode));
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", Utility::SliceModeAsString(mode));
-}
-
-Plugin::AMD::H264SliceMode Plugin::AMD::H264Encoder::GetSliceMode() {
- uint32_t mode;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"SliceMode", &mode);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", Utility::SliceModeAsString((H264SliceMode)mode));
- return (H264SliceMode)mode;
-}
-
-void Plugin::AMD::H264Encoder::SetMaximumSliceSize(uint32_t size) {
- size = clamp(size, 1, INT_MAX);
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"MaxSliceSize", (uint32_t)size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, size);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", size);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetMaximumSliceSize() {
- uint32_t size;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"MaxSliceSize", &size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", size);
- return size;
-}
-
-void Plugin::AMD::H264Encoder::SetSliceControlMode(H264SliceControlMode mode) {
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"SliceControlMode", (uint32_t)mode);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %s failed with error %ls (code %d).", res, Utility::SliceControlModeAsString(mode));
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %s.", Utility::SliceControlModeAsString(mode));
-}
-
-Plugin::AMD::H264SliceControlMode Plugin::AMD::H264Encoder::GetSliceControlMode() {
- uint32_t mode;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"SliceControlMode", &mode);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %s.", Utility::SliceControlModeAsString((H264SliceControlMode)mode));
- return (H264SliceControlMode)mode;
-}
-
-void Plugin::AMD::H264Encoder::SetSliceControlSize(uint32_t size) {
- // If GetSliceMode() is VCESliceMode_Vertical, then it outputs nothing with the following settings:
- // - SliceControlMode: VCESliceControlMode_Macroblock
- // - SliceControlSize: < 3600
- // If GetSliceMode() is VCESliceMode_Horizontal, then it outputs nothing with the following settings:
- // - SliceControlMode: VCESliceControlMode_Macroblock
- // - SliceControlSize: < 32
-
- // H264 Macroblock = 16*16 = 256
- switch (GetSliceControlMode()) {
- case H264SliceControlMode::Off:
- return;
- case H264SliceControlMode::Macroblock:
- size = clamp(size, 0, (uint32_t)(ceil(m_FrameSize.first / 16) * ceil(m_FrameSize.second / 16)));
- break;
- case H264SliceControlMode::Macroblock_Row:
- size = clamp(size, 0, (uint32_t)ceil(m_FrameSize.second / 16));
- break;
- }
-
- AMF_RESULT res = m_AMFEncoder->SetProperty(L"SliceControlSize", (uint32_t)size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Setting to %d failed with error %ls (code %d).", res, size);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Set to %d.", size);
-}
-
-uint32_t Plugin::AMD::H264Encoder::GetSliceControlSize() {
- uint32_t size;
- AMF_RESULT res = m_AMFEncoder->GetProperty(L"SliceControlSize", &size);
- if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Failed with error %ls (code %d).", res);
- }
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Value is %d.", size);
- return size;
-}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/misc-util.cpp
Deleted
-/*
-MIT License
-
-Copyright (c) 2016 Michael Fabian Dirks
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
-*/
-
-#pragma once
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
-
-// Plugin
-#include "plugin.h"
-#include "amf.h"
-#include "amf-capabilities.h"
-#include "amf-h264.h"
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
-using namespace Plugin::AMD;
-
-namespace Plugin {
- namespace Utility {
- H264ProfileLevel inline GetMinimumProfileLevel(std::pair<uint32_t, uint32_t> frameSize, std::pair<uint32_t, uint32_t> frameRate) {
- typedef std::pair<uint32_t, uint32_t> levelRestriction;
- typedef std::pair<H264ProfileLevel, levelRestriction> level;
-
- static const level profileLevelLimit[] = { // [Level, [Samples, Samples_Per_Sec]]
- level(H264ProfileLevel::L10, levelRestriction(25344, 380160)),
- level(H264ProfileLevel::L11, levelRestriction(101376, 768000)),
- level(H264ProfileLevel::L12, levelRestriction(101376, 1536000)),
- level(H264ProfileLevel::L13, levelRestriction(101376, 3041280)),
- level(H264ProfileLevel::L20, levelRestriction(101376, 3041280)),
- level(H264ProfileLevel::L21, levelRestriction(202752, 5068800)),
- level(H264ProfileLevel::L22, levelRestriction(414720, 5184000)),
- level(H264ProfileLevel::L30, levelRestriction(414720, 10368000)),
- level(H264ProfileLevel::L31, levelRestriction(921600, 27648000)),
- level(H264ProfileLevel::L32, levelRestriction(1310720, 55296000)),
- //level(H264ProfileLevel::40, levelRestriction(2097152, 62914560)), // Technically identical to 4.1, but backwards compatible.
- level(H264ProfileLevel::L41, levelRestriction(2097152, 62914560)),
- level(H264ProfileLevel::L42, levelRestriction(2228224, 133693440)),
- level(H264ProfileLevel::L50, levelRestriction(5652480, 150994944)),
- level(H264ProfileLevel::L51, levelRestriction(9437184, 251658240)),
- level(H264ProfileLevel::L52, levelRestriction(9437184, 530841600)),
- level((H264ProfileLevel)-1, levelRestriction(0, 0))
- };
-
- uint32_t samples = frameSize.first * frameSize.second;
- uint32_t samples_sec = (uint32_t)ceil((double_t)samples * ((double_t)frameRate.first / (double_t)frameRate.second));
-
- level curLevel = profileLevelLimit[0];
- for (uint32_t index = 0; (int32_t)curLevel.first != -1; index++) {
- curLevel = profileLevelLimit[index];
-
- if (samples > curLevel.second.first)
- continue;
-
- if (samples_sec > curLevel.second.second)
- continue;
-
- return curLevel.first;
- }
- return H264ProfileLevel::L52;
- }
-
- #pragma region VCEEncoderType
- inline const char* VCEEncoderTypeAsString(H264EncoderType type) {
- const char* types[] = {
- "AVC",
- "SVC",
- "HEVC"
- };
- return types[(uint8_t)type];
- }
- inline const wchar_t* VCEEncoderTypeAsAMF(H264EncoderType type) {
- const wchar_t* types[] = {
- AMFVideoEncoderVCE_AVC,
- AMFVideoEncoderVCE_SVC,
- L"AMFVideoEncoderHW_HEVC"
- };
- return types[(uint8_t)type];
- }
- #pragma endregion VCEEncoderType
- #pragma region VCEMemoryType
- inline const char* MemoryTypeAsString(H264MemoryType memoryType) {
- static const char* memoryTypeToString[] = {
- "Host",
- "DirectX9",
- "DirectX11",
- "OpenGL"
- };
- return memoryTypeToString[(uint8_t)memoryType];
- }
- inline amf::AMF_MEMORY_TYPE MemoryTypeAsAMF(H264MemoryType memoryType) {
- static amf::AMF_MEMORY_TYPE memoryTypeToAMF[] = {
- amf::AMF_MEMORY_HOST,
- amf::AMF_MEMORY_DX9,
- amf::AMF_MEMORY_DX11,
- amf::AMF_MEMORY_OPENGL,
- };
- return memoryTypeToAMF[(uint8_t)memoryType];
- }
- #pragma endregion VCEMemoryType
- #pragma region VCESurfaceFormat
- inline const char* SurfaceFormatAsString(H264ColorFormat surfaceFormat) {
- static const char* surfaceFormatToString[] = {
- "NV12",
- "I420",
- "YUY2",
- "BGRA",
- "RGBA",
- "GRAY",
- };
- return surfaceFormatToString[(uint8_t)surfaceFormat];
- }
- inline amf::AMF_SURFACE_FORMAT SurfaceFormatAsAMF(H264ColorFormat surfaceFormat) {
- static amf::AMF_SURFACE_FORMAT surfaceFormatToAMF[] = {
- // 4:2:0 Formats
- amf::AMF_SURFACE_NV12,
- amf::AMF_SURFACE_YUV420P,
- // 4:2:2 Formats
- amf::AMF_SURFACE_YUY2,
- // Uncompressed
- amf::AMF_SURFACE_BGRA,
- amf::AMF_SURFACE_RGBA,
- // Other
- amf::AMF_SURFACE_GRAY8,
- };
- return surfaceFormatToAMF[(uint8_t)surfaceFormat];
- }
- #pragma endregion VCESurfaceFormat
- #pragma region VCEUsage
- inline const char* UsageAsString(H264Usage usage) {
- static const char* usageToString[] = {
- "Transcoding",
- "Ultra Low Latency",
- "Low Latency",
- "Webcam"
- };
- return usageToString[(uint8_t)usage];
- }
- inline AMF_VIDEO_ENCODER_USAGE_ENUM UsageAsAMF(H264Usage usage) {
- static AMF_VIDEO_ENCODER_USAGE_ENUM usageToAMF[] = {
- AMF_VIDEO_ENCODER_USAGE_TRANSCONDING,
- AMF_VIDEO_ENCODER_USAGE_ULTRA_LOW_LATENCY,
- AMF_VIDEO_ENCODER_USAGE_LOW_LATENCY,
- AMF_VIDEO_ENCODER_USAGE_WEBCAM,
- };
- return usageToAMF[(uint8_t)usage];
- }
- inline H264Usage UsageFromAMF(uint32_t usage) {
- static H264Usage usageFromAMF[] = {
- H264Usage::Transcoding,
- H264Usage::UltraLowLatency,
- H264Usage::LowLatency,
- H264Usage::Webcam,
- };
- return usageFromAMF[(uint8_t)usage];
- }
- #pragma endregion VCEUsage
- #pragma region VCEQualityPreset
- inline const char* QualityPresetAsString(H264QualityPreset preset) {
- static const char* qualityPresetToString[] = {
- "Speed",
- "Balanced",
- "Quality"
- };
- return qualityPresetToString[(uint8_t)preset];
- }
- #pragma endregion VCEQualityPreset
- #pragma region VCEProfile
- inline const char* ProfileAsString(H264Profile profile) {
- switch (profile) {
- case H264Profile::Baseline:
- return "Baseline";
- case H264Profile::Main:
- return "Main";
- case H264Profile::High:
- return "High";
- case H264Profile::ConstrainedBaseline:
- return "Constrained Baseline";
- case H264Profile::ConstrainedHigh:
- return "Constrained High";
- }
-
- return "Invalid";
- }
- #pragma endregion VCEProfile
- #pragma region VCERateControlMethod
- inline const char* RateControlMethodAsString(H264RateControlMethod method) {
- static const char* rateControlMethodToString[] = {
- "Constant Quantization Parameter (CQP)",
- "Constant Bitrate (CBR)",
- "Peak Constrained Variable Bitrate (VBR)",
- "Latency Constrained Variable Bitrate (VBR_LAT)"
- };
- return rateControlMethodToString[(uint8_t)method];
- }
- inline AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_ENUM RateControlMethodAsAMF(H264RateControlMethod method) {
- static AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_ENUM CustomToAMF[] = {
- AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_CONSTANT_QP,
- AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_CBR,
- AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_PEAK_CONSTRAINED_VBR,
- AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_LATENCY_CONSTRAINED_VBR,
- };
- return CustomToAMF[(uint8_t)method];
- }
- inline H264RateControlMethod RateControlMethodFromAMF(uint32_t method) {
- static H264RateControlMethod AMFToCustom[] = {
- H264RateControlMethod::ConstantQP,
- H264RateControlMethod::ConstantBitrate,
- H264RateControlMethod::VariableBitrate_PeakConstrained,
- H264RateControlMethod::VariableBitrate_LatencyConstrained,
- };
- return AMFToCustom[(uint8_t)method];
- }
- #pragma endregion VCERateControlMethod
-
- inline const char* CodingTypeAsString(H264CodingType type) {
- switch (type) {
- case H264CodingType::CABAC:
- return "CABAC";
- case H264CodingType::CALVC:
- return "CALVC";
- case H264CodingType::Default:
- return "Default";
- }
- return "MEMORY CORRUPTION";
- }
- inline const char* SliceModeAsString(H264SliceMode mode) {
- switch (mode) {
- case H264SliceMode::Horizontal:
- return "Horizontal";
- case H264SliceMode::Vertical:
- return "Vertical";
- }
- return "MEMORY CORRUPTION";
- }
- inline const char* SliceControlModeAsString(H264SliceControlMode mode) {
- switch (mode) {
- case H264SliceControlMode::Off:
- return "Off";
- case H264SliceControlMode::Macroblock:
- return "Macroblock";
- case H264SliceControlMode::Macroblock_Row:
- return "Macroblock Row";
- }
- return "MEMORY CORRUPTION";
- }
- }
-}
\ No newline at end of file
obs-studio-18.0.2.tar.xz/.gitattributes
Added
+* text=auto
+
+*.sln text eol=crlf
+*.vcproj text eol=crlf
+*.vcxproj text eol=crlf
+*.vcxproj text eol=crlf
+*.vcxproj.filters text eol=crlf
+
+cmake/ALL_BUILD.vcxproj.user.in text eol=crlf
+
obs-studio-18.0.2.tar.xz/.gitmodules
Added
+[submodule "plugins/win-dshow/libdshowcapture"]
+ path = plugins/win-dshow/libdshowcapture
+ url = https://github.com/jp9000/libdshowcapture.git
+[submodule "plugins/mac-syphon/syphon-framework"]
+ path = plugins/mac-syphon/syphon-framework
+ url = https://github.com/palana/Syphon-Framework.git
+[submodule "plugins/enc-amf"]
+ path = plugins/enc-amf
+ url = https://github.com/Xaymar/obs-studio_amf-encoder-plugin.git
+[submodule "plugins/obs-browser"]
+ path = plugins/obs-browser
+ url = https://github.com/kc5nra/obs-browser.git
+[submodule "plugins/obs-vst"]
+ path = plugins/obs-vst
+ url = https://github.com/DDRBoxman/obs-vst.git
obs-studio-18.0.1.tar.xz/.travis.yml -> obs-studio-18.0.2.tar.xz/.travis.yml
Changed
language: cpp
+cache:
+ ccache: true
+
env:
global:
# AWS S3 creds
# secret
- secure: "JRQVU2zgC3hY6CEY+Crmh/upp93En0BzKaLcsuBT538johNlK7m5hn3m2UOw63seLvBvVaKKWUDj9N986a3DwcXxWPMyF/9ctXgNWy39WzaVWxrbVR5nQB1fdiRp5YEgkoVN+gEm3OVF7sV5AGzh5/8CvEdRCoTLIGgMGHxW9mc="
+ # ccache
+ - USE_CCACHE=1
+ - CCACHE_COMPRESS=1
+ - CCACHE_MAXSIZE=200M
+ - CCACHE_CPP2=1
+
matrix:
include:
- os: osx
all_branches: true
# The channel name "azubu.il.us.quakenet.org#obs-dev" is encrypted against jp9000/obs-studio to prevent IRC spam of forks
+#notifications:
+# irc:
+# skip_join: false
+# template:
+# - "[Travis CI|%{result}] %{repository_name}/%{branch} (%{author} - %{commit_subject}) %{build_url}"
+# channels:
+# - secure: k9j7+ogVODMlveZdd5pP73AVLCFl1VbzVaVon0ECn3EQcxnLSpiZbc6l+PnIUKgee5pRKtUB4breufgmr4puq3s69YeQiOVKk5gx2yJGZ5jGacbSne0xTspzPxapiEbVUkcJ2L7gKntDG4+SUiW67dtt4G26O7zsErDF/lY/woQ=
+# on_failure: always
+# on_success: change
notifications:
- irc:
- skip_join: false
- template:
- - "[Travis CI|%{result}] %{repository_name}/%{branch} (%{author} - %{commit_subject}) %{build_url}"
- channels:
- - secure: k9j7+ogVODMlveZdd5pP73AVLCFl1VbzVaVon0ECn3EQcxnLSpiZbc6l+PnIUKgee5pRKtUB4breufgmr4puq3s69YeQiOVKk5gx2yJGZ5jGacbSne0xTspzPxapiEbVUkcJ2L7gKntDG4+SUiW67dtt4G26O7zsErDF/lY/woQ=
- on_failure: always
+ webhooks:
+ urls:
+ - secure: T5RBY818nO40nr5eC8pdrCfAdQKGkjQdbyYw7mfFrhxWxgt/U5tyKXpX0l9zNGfobS0SnLSqF71OrfW04V97oijXx3q5Y24xV6mSrlLQZOq19+XvGp82LDpkVd4yi2N0kBYpoANB9Pkof4jWT/rKfdQCQttluOLjgr5SM0uWHRg=
on_success: change
+ on_failure: always
+
obs-studio-18.0.1.tar.xz/CI/before-script-linux.sh -> obs-studio-18.0.2.tar.xz/CI/before-script-linux.sh
Changed
#!/bin/sh
set -ex
+ccache -s || echo "CCache is not available."
mkdir build && cd build
cmake ..
obs-studio-18.0.1.tar.xz/CI/before-script-osx.sh -> obs-studio-18.0.2.tar.xz/CI/before-script-osx.sh
Changed
+# Make sure ccache is found
+export PATH=/usr/local/opt/ccache/libexec:$PATH
+
mkdir build
cd build
cmake -DENABLE_SPARKLE_UPDATER=ON -DCMAKE_OSX_DEPLOYMENT_TARGET=10.9 -DDepsPath=/tmp/obsdeps -DVLCPath=$PWD/../../vlc-master -DBUILD_BROWSER=ON -DCEF_ROOT_DIR=$PWD/../../cef_binary_${CEF_BUILD_VERSION}_macosx64 ..
obs-studio-18.0.1.tar.xz/CI/install-dependencies-linux.sh -> obs-studio-18.0.2.tar.xz/CI/install-dependencies-linux.sh
Changed
#!/bin/sh
set -ex
-sudo add-apt-repository ppa:kirillshkrogalev/ffmpeg-next -y
sudo apt-get -qq update
sudo apt-get install -y \
build-essential \
checkinstall \
cmake \
libasound2-dev \
- libavcodec-ffmpeg-dev \
- libavdevice-ffmpeg-dev \
- libavfilter-ffmpeg-dev \
- libavformat-ffmpeg-dev \
- libavutil-ffmpeg-dev \
libcurl4-openssl-dev \
+ libfdk-aac-dev \
libfontconfig-dev \
libfreetype6-dev \
libgl1-mesa-dev \
libpulse-dev \
libqt5x11extras5-dev \
libspeexdsp-dev \
- libswresample-ffmpeg-dev \
- libswscale-ffmpeg-dev \
libudev-dev \
libv4l-dev \
libvlc-dev \
libxcomposite-dev \
libxinerama-dev \
pkg-config \
- qtbase5-dev
+ qtbase5-dev \
+ yasm \
+ zlib1g-dev
+
+# FFmpeg
+cd ..
+git clone --depth 1 git://source.ffmpeg.org/ffmpeg.git
+cd ffmpeg
+./configure --enable-shared
+make -j2
+sudo make install
obs-studio-18.0.1.tar.xz/CI/install-dependencies-osx.sh -> obs-studio-18.0.2.tar.xz/CI/install-dependencies-osx.sh
Changed
brew update
-#Base OBS Deps
-brew install qt5 jack speexdsp
+#Base OBS Deps and ccache
+brew install qt5 jack speexdsp ccache
+
+export PATH=/usr/local/opt/ccache/libexec:$PATH
+ccache -s || echo "CCache is not available."
# Fetch and untar prebuilt OBS deps that are compatible with older versions of OSX
curl -L -O https://s3-us-west-2.amazonaws.com/obs-nightly/osx-deps.tar.gz -f --retry 5 -C -
obs-studio-18.0.2.tar.xz/CONTRIBUTING.rst
Added
+Contributing
+============
+
+Quick Links for Contributing
+----------------------------
+
+ - Compiling and building OBS Studio:
+ https://github.com/jp9000/obs-studio/wiki/Install-Instructions
+
+ - Our bug tracker (linked to forum accounts):
+ https://obsproject.com/mantis/
+
+ - Development IRC channel: #obs-dev on QuakeNet
+
+ - Development forum:
+ https://obsproject.com/forum/list/general-development.21/
+
+ - To contribute language translations, do not make pull requests.
+ Instead, use crowdin. Read here for more information:
+ https://obsproject.com/forum/threads/how-to-contribute-translations-for-obs.16327/
+
+Coding Guidelines
+-----------------
+
+ - OBS Studio uses kernel normal form (linux variant), for more
+ information, please read here:
+ https://github.com/torvalds/linux/blob/master/Documentation/process/coding-style.rst
+
+ - Avoid trailing spaces. To view trailing spaces before making a
+ commit, use "git diff" on your changes. If colors are enabled for
+ git in the command prompt, it will show you any whitespace issues
+ marked with red.
+
+ - Tabs for indentation, spaces for alignment. Tabs are treated as 8
+ columns wide.
+
+ - 80 columns max
+
+Commit Guidlines
+----------------
+
+ - OBS Studio uses the 50/72 standard for commits. 50 characters max
+ for the title (excluding module prefix), an empty line, and then a
+ full description of the commit, wrapped to 72 columns max. See this
+ link for more information: http://chris.beams.io/posts/git-commit/
+
+ - Make sure commit titles are always in present tense, and are not
+ followed by punctuation.
+
+ - Prefix commit titles with the module name, followed by a colon and a
+ space (unless modifying a file in the base directory). When
+ modifying cmake modules, prefix with "cmake". So for example, if you
+ are modifying the obs-ffmpeg plugin::
+
+ obs-ffmpeg: Fix bug with audio output
+
+ Or for libobs::
+
+ libobs: Fix source not displaying
+
+ - If you still need examples, please view the commit history.
+
+Headers
+-------
+
+ There's no formal documentation as of yet, so it's recommended to read
+ the headers (which are heavily commented) to learn the API.
+
+ Here are the most important headers to check out::
+
+ libobs/obs.h Main header
+
+ libobs/obs-module.h Main header for plugin modules
+
+ libobs/obs-source.h Creating video/audio sources
+
+ libobs/obs-output.h Creating outputs
+
+ libobs/obs-encoder.h Implementing encoders
+
+ libobs/obs-service.h Implementing custom streaming services
+
+ libobs/graphics/graphics.h Graphics API
+
+ UI/obs-frontend-api/obs-frontend-api.h
+ Front-end API
+
+ If you would like to learn from example, examine the default plugins
+ (in the <plugins> subdirectory). All features of OBS Studio are
+ implemented as plugins.
obs-studio-18.0.2.tar.xz/README.rst
Added
+OBS Studio <https://obsproject.com>
+===================================
+
+What is OBS Studio?
+-------------------
+
+ OBS Studio is software designed for capturing, compositing, encoding,
+ recording, and streaming video content, efficiently.
+
+ It's distributed under the GNU General Public License v2 - see the
+ accompanying COPYING file for more details.
+
+Quick Links
+-----------
+
+ - Website: https://obsproject.com
+
+ - Help/Guides: https://github.com/jp9000/obs-studio/wiki
+
+ - Forums: https://obsproject.com/forum/
+
+ - Build Instructions: https://github.com/jp9000/obs-studio/wiki/Install-Instructions
+
+ - Bug Tracker: https://obsproject.com/mantis/
+
+ (Note: The bug tracker is linked to forum accounts. To use the bug
+ tracker, log in to a forum account)
+
+Contributing
+------------
+
+ - If you wish to contribute code to the project, please make sure read
+ the coding and commit guidelines:
+ https://github.com/jp9000/obs-studio/blob/master/CONTRIBUTING.rst
+
+ - If you wish to contribute translations, do not submit pull requests.
+ Instead, please use Crowdin. For more information read this thread:
+ https://obsproject.com/forum/threads/how-to-contribute-translations-for-obs.16327/
+
+ - Other ways to contribute are by helping people out with support on
+ our forums or in our community chat. Please limit support to topics
+ you fully understand -- bad advice is worse than no advice. When it
+ comes to something that you don't fully know or understand, please
+ defer to the official help or official channels.
obs-studio-18.0.1.tar.xz/UI/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/UI/CMakeLists.txt
Changed
endif()
find_package(Qt5Widgets ${FIND_MODE})
+find_package(FFmpeg REQUIRED COMPONENTS avcodec avutil avformat)
if(NOT Qt5Widgets_FOUND)
if (ENABLE_UI)
endif()
endif()
+
+include_directories(${FFMPEG_INCLUDE_DIRS})
include_directories(SYSTEM "obs-frontend-api")
include_directories(SYSTEM "${CMAKE_SOURCE_DIR}/libobs")
+include_directories(SYSTEM "${CMAKE_SOURCE_DIR}/deps/libff")
find_package(Libcurl REQUIRED)
include_directories(${LIBCURL_INCLUDE_DIRS})
Qt5::X11Extras)
endif()
+set(obs_libffutil_SOURCES
+ ../deps/libff/libff/ff-util.c
+ )
+set(obs_libffutil_HEADERS
+ ../deps/libff/libff/ff-util.h
+ )
+
+if(MSVC)
+ set_source_files_properties(
+ ../deps/libff/libff/ff-util.c
+ PROPERTIES COMPILE_FLAGS -Dinline=__inline
+ )
+endif()
+
set(obs_SOURCES
${obs_PLATFORM_SOURCES}
+ ${obs_libffutil_SOURCES}
obs-app.cpp
api-interface.cpp
window-basic-main.cpp
set(obs_HEADERS
${obs_PLATFORM_HEADERS}
+ ${obs_libffutil_HEADERS}
obs-app.hpp
platform.hpp
window-main.hpp
target_link_libraries(obs
libobs
- libff
Qt5::Widgets
obs-frontend-api
+ ${FFMPEG_LIBRARIES}
${LIBCURL_LIBRARIES}
${obs_PLATFORM_LIBRARIES})
obs-studio-18.0.1.tar.xz/UI/api-interface.cpp -> obs-studio-18.0.2.tar.xz/UI/api-interface.cpp
Changed
App()->PopUITranslation();
}
+ void obs_frontend_set_streaming_service(obs_service_t *service) override
+ {
+ main->SetService(service);
+ }
+
+ obs_service_t *obs_frontend_get_streaming_service(void) override
+ {
+ return main->GetService();
+ }
+
+ void obs_frontend_save_streaming_service(void) override
+ {
+ main->SaveService();
+ }
+
void on_load(obs_data_t *settings) override
{
for (auto cb : saveCallbacks)
obs-studio-18.0.1.tar.xz/UI/data/locale/en-US.ini -> obs-studio-18.0.2.tar.xz/UI/data/locale/en-US.ini
Changed
ReplayBuffer="Replay Buffer"
Import="Import"
Export="Export"
+Copy="Copy"
+Paste="Paste"
+PasteReference="Paste (Reference)"
+PasteDuplicate="Paste (Duplicate)"
+RemuxRecordings="Remux Recordings"
+
+# copy filters
+Copy.Filters="Copy Filters"
+Paste.Filters="Paste Filters"
# updater
Updater.Title="New update available"
OutputWarnings.NoTracksSelected="You must select at least one track"
OutputWarnings.MultiTrackRecording="Warning: Certain formats (such as FLV) do not support multiple tracks per recording"
OutputWarnings.MP4Recording="Warning: Recordings saved to MP4 will be unrecoverable if the file cannot be finalized (e.g. as a result of BSODs, power losses, etc.). If you want to record multiple audio tracks consider using MKV and remux the recording to mp4 after it is finished (File->Remux Recordings)"
+
+# deleting final scene
+FinalScene.Title="Delete Scene"
+FinalScene.Text="There needs to be at least one scene."
obs-studio-18.0.1.tar.xz/UI/forms/OBSBasic.ui -> obs-studio-18.0.2.tar.xz/UI/forms/OBSBasic.ui
Changed
<addaction name="actionScaleCanvas"/>
<addaction name="actionScaleOutput"/>
</widget>
+ <action name="actionCopySource">
+ <property name="text">
+ <string>Copy</string>
+ </property>
+ <property name="shortcut">
+ <string>Ctrl+C</string>
+ </property>
+ </action>
+ <action name="actionPasteRef">
+ <property name="enabled">
+ <bool>false</bool>
+ </property>
+ <property name="text">
+ <string>PasteReference</string>
+ </property>
+ <property name="iconText">
+ <string>PasteReference</string>
+ </property>
+ <property name="toolTip">
+ <string>PasteReference</string>
+ </property>
+ <property name="shortcut">
+ <string>Ctrl+V</string>
+ </property>
+ </action>
+ <action name="actionCopyFilters">
+ <property name="text">
+ <string>Copy.Filters</string>
+ </property>
+ </action>
+ <action name="actionPasteFilters">
+ <property name="enabled">
+ <bool>false</bool>
+ </property>
+ <property name="text">
+ <string>Paste.Filters</string>
+ </property>
+ </action>
+ <addaction name="actionCopySource"/>
+ <addaction name="actionPasteRef"/>
+ <addaction name="actionPasteDup"/>
+ <addaction name="separator"/>
+ <addaction name="actionCopyFilters"/>
+ <addaction name="actionPasteFilters"/>
+ <addaction name="separator"/>
<addaction name="transformMenu"/>
<addaction name="orderMenu"/>
<addaction name="scalingMenu"/>
<property name="text">
<string>Basic.MainMenu.Edit.Transform.EditTransform</string>
</property>
+ <property name="shortcut">
+ <string>Ctrl+E</string>
+ </property>
</action>
<action name="actionCopyTransform">
<property name="text">
<string>Basic.MainMenu.Edit.Scale.Output</string>
</property>
</action>
+ <action name="actionPasteDup">
+ <property name="text">
+ <string>PasteDuplicate</string>
+ </property>
+ </action>
</widget>
<customwidgets>
<customwidget>
obs-studio-18.0.1.tar.xz/UI/forms/OBSRemux.ui -> obs-studio-18.0.2.tar.xz/UI/forms/OBSRemux.ui
Changed
</rect>
</property>
<property name="windowTitle">
- <string>Dialog</string>
+ <string>RemuxRecordings</string>
</property>
- <widget class="QProgressBar" name="progressBar">
- <property name="geometry">
- <rect>
- <x>10</x>
- <y>90</y>
- <width>351</width>
- <height>23</height>
- </rect>
- </property>
- <property name="value">
- <number>24</number>
- </property>
- </widget>
- <widget class="QWidget" name="formLayoutWidget">
- <property name="geometry">
- <rect>
- <x>10</x>
- <y>10</y>
- <width>471</width>
- <height>71</height>
- </rect>
- </property>
- <layout class="QFormLayout" name="formLayout">
- <property name="fieldGrowthPolicy">
- <enum>QFormLayout::AllNonFixedFieldsGrow</enum>
- </property>
- <property name="verticalSpacing">
- <number>6</number>
- </property>
+ <layout class="QGridLayout" name="formLayout">
<item row="1" column="0">
<widget class="QLabel" name="label">
<property name="text">
<layout class="QHBoxLayout" name="horizontalLayout_2">
<item>
<widget class="QLineEdit" name="sourceFile">
- <property name="sizePolicy">
- <sizepolicy hsizetype="Expanding" vsizetype="Preferred">
- <horstretch>0</horstretch>
- <verstretch>0</verstretch>
- </sizepolicy>
- </property>
</widget>
</item>
<item>
<layout class="QHBoxLayout" name="horizontalLayout_3">
<item>
<widget class="QLineEdit" name="targetFile">
- <property name="sizePolicy">
- <sizepolicy hsizetype="Expanding" vsizetype="Preferred">
- <horstretch>0</horstretch>
- <verstretch>0</verstretch>
- </sizepolicy>
- </property>
</widget>
</item>
<item>
</item>
</layout>
</item>
+ <item row="3" column="0" colspan="2">
+ <widget class="QProgressBar" name="progressBar">
+ <property name="value">
+ <number>24</number>
+ </property>
+ </widget>
+ </item>
+ <item row="4" column="1">
+ <layout class="QHBoxLayout" name="horizontalLayout_4">
+ <item>
+ <widget class="QDialogButtonBox" name="buttonBox">
+ <property name="standardButtons">
+ <set>QDialogButtonBox::Ok|QDialogButtonBox::Close</set>
+ </property>
+ </widget>
+ </item>
+ </layout>
+ </item>
</layout>
- </widget>
- <widget class="QPushButton" name="remux">
- <property name="geometry">
- <rect>
- <x>370</x>
- <y>90</y>
- <width>111</width>
- <height>23</height>
- </rect>
- </property>
- <property name="text">
- <string>Remux.Remux</string>
- </property>
- </widget>
</widget>
<resources/>
<connections/>
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/CMakeLists.txt
Changed
set(frontend-tools_PLATFORM_SOURCES
${frontend-tools_PLATFORM_SOURCES}
captions.cpp
- captions-stream.cpp)
+ captions-handler.cpp
+ captions-mssapi.cpp
+ captions-mssapi-stream.cpp)
set(frontend-tools_PLATFORM_HEADERS
captions.hpp
- captions-stream.hpp)
+ captions-handler.hpp
+ captions-mssapi.hpp
+ captions-mssapi-stream.hpp)
set(frontend-tools_PLATFORM_UI
forms/captions.ui)
endif()
obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/captions-handler.cpp
Added
+#include "captions-handler.hpp"
+
+captions_handler::captions_handler(
+ captions_cb callback,
+ enum audio_format format,
+ uint32_t sample_rate)
+ : cb(callback)
+{
+ if (!reset_resampler(format, sample_rate))
+ throw CAPTIONS_ERROR_GENERIC_FAIL;
+}
+
+bool captions_handler::reset_resampler(
+ enum audio_format format,
+ uint32_t sample_rate)
+try {
+ obs_audio_info ai;
+ if (!obs_get_audio_info(&ai))
+ throw std::string("Failed to get OBS audio info");
+
+ resample_info src = {
+ ai.samples_per_sec,
+ AUDIO_FORMAT_FLOAT_PLANAR,
+ ai.speakers
+ };
+ resample_info dst = {
+ sample_rate,
+ format,
+ SPEAKERS_MONO
+ };
+
+ if (!resampler.reset(dst, src))
+ throw std::string("Failed to create audio resampler");
+
+ return true;
+
+} catch (std::string text) {
+ blog(LOG_WARNING, "%s: %s", __FUNCTION__, text.c_str());
+ return false;
+}
+
+void captions_handler::push_audio(const audio_data *audio)
+{
+ uint8_t *out[MAX_AV_PLANES];
+ uint32_t frames;
+ uint64_t ts_offset;
+ bool success;
+
+ success = audio_resampler_resample(resampler,
+ out, &frames, &ts_offset,
+ (const uint8_t *const *)audio->data, audio->frames);
+ if (success)
+ pcm_data(out[0], frames);
+}
obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/captions-handler.hpp
Added
+#pragma once
+
+#include <media-io/audio-resampler.h>
+#include <obs-module.h>
+#include <functional>
+#include <string>
+
+class resampler_obj {
+ audio_resampler_t *resampler = nullptr;
+
+public:
+ inline ~resampler_obj()
+ {
+ audio_resampler_destroy(resampler);
+ }
+
+ inline bool reset(const resample_info &dst, const resample_info &src)
+ {
+ audio_resampler_destroy(resampler);
+ resampler = audio_resampler_create(&dst, &src);
+ return !!resampler;
+ }
+
+ inline operator audio_resampler_t*() {return resampler;}
+};
+
+/* ------------------------------------------------------------------------- */
+
+typedef std::function<void (const std::string &)> captions_cb;
+
+#define captions_error(s) std::string(obs_module_text("Captions.Error." ## s))
+#define CAPTIONS_ERROR_GENERIC_FAIL captions_error("GenericFail")
+
+/* ------------------------------------------------------------------------- */
+
+class captions_handler {
+ captions_cb cb;
+ resampler_obj resampler;
+
+protected:
+ inline void callback(const std::string &text)
+ {
+ cb(text);
+ }
+
+ virtual void pcm_data(const void *data, size_t frames)=0;
+
+ /* always resamples to 1 channel */
+ bool reset_resampler(enum audio_format format, uint32_t sample_rate);
+
+public:
+ /* throw std::string for errors shown to users */
+ captions_handler(
+ captions_cb callback,
+ enum audio_format format,
+ uint32_t sample_rate);
+ virtual ~captions_handler() {}
+
+ void push_audio(const audio_data *audio);
+};
+
+/* ------------------------------------------------------------------------- */
+
+struct captions_handler_info {
+ std::string (*name)(void);
+ captions_handler *(*create)(captions_cb cb, const std::string &lang);
+};
obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/captions-mssapi-stream.cpp
Added
+#include "captions-mssapi-stream.hpp"
+#include "captions-mssapi.hpp"
+#include <mmreg.h>
+#include <util/windows/CoTaskMemPtr.hpp>
+#include <util/threading.h>
+#include <util/base.h>
+
+using namespace std;
+
+#if 0
+#define debugfunc(format, ...) blog(LOG_DEBUG, "[Captions] %s(" format ")", \
+ __FUNCTION__, ##__VA_ARGS__)
+#else
+#define debugfunc(format, ...)
+#endif
+
+CaptionStream::CaptionStream(DWORD samplerate_, mssapi_captions *handler_) :
+ handler(handler_),
+ samplerate(samplerate_),
+ event(CreateEvent(nullptr, false, false, nullptr))
+{
+ buf_info.ulMsMinNotification = 50;
+ buf_info.ulMsBufferSize = 500;
+ buf_info.ulMsEventBias = 0;
+
+ format.wFormatTag = WAVE_FORMAT_PCM;
+ format.nChannels = 1;
+ format.nSamplesPerSec = 16000;
+ format.nAvgBytesPerSec = format.nSamplesPerSec * sizeof(uint16_t);
+ format.nBlockAlign = 2;
+ format.wBitsPerSample = 16;
+ format.cbSize = sizeof(format);
+}
+
+void CaptionStream::Stop()
+{
+ {
+ lock_guard<mutex> lock(m);
+ circlebuf_free(buf);
+ }
+
+ cv.notify_one();
+}
+
+void CaptionStream::PushAudio(const void *data, size_t frames)
+{
+ bool ready = false;
+
+ lock_guard<mutex> lock(m);
+ circlebuf_push_back(buf, data, frames * sizeof(int16_t));
+ write_pos += frames * sizeof(int16_t);
+
+ if (wait_size && buf->size >= wait_size)
+ ready = true;
+ if (ready)
+ cv.notify_one();
+}
+
+// IUnknown methods
+
+STDMETHODIMP CaptionStream::QueryInterface(REFIID riid, void **ppv)
+{
+ if (riid == IID_IUnknown) {
+ AddRef();
+ *ppv = this;
+
+ } else if (riid == IID_IStream) {
+ AddRef();
+ *ppv = (IStream*)this;
+
+ } else if (riid == IID_ISpStreamFormat) {
+ AddRef();
+ *ppv = (ISpStreamFormat*)this;
+
+ } else if (riid == IID_ISpAudio) {
+ AddRef();
+ *ppv = (ISpAudio*)this;
+
+ } else {
+ *ppv = nullptr;
+ return E_NOINTERFACE;
+ }
+
+ return NOERROR;
+}
+
+STDMETHODIMP_(ULONG) CaptionStream::AddRef()
+{
+ return (ULONG)os_atomic_inc_long(&refs);
+}
+
+STDMETHODIMP_(ULONG) CaptionStream::Release()
+{
+ ULONG new_refs = (ULONG)os_atomic_dec_long(&refs);
+ if (!new_refs)
+ delete this;
+
+ return new_refs;
+}
+
+// ISequentialStream methods
+
+STDMETHODIMP CaptionStream::Read(void *data, ULONG bytes, ULONG *read_bytes)
+{
+ HRESULT hr = S_OK;
+ size_t cur_size;
+
+ debugfunc("data, %lu, read_bytes", bytes);
+ if (!data)
+ return STG_E_INVALIDPOINTER;
+
+ {
+ lock_guard<mutex> lock1(m);
+ wait_size = bytes;
+ cur_size = buf->size;
+ }
+
+ unique_lock<mutex> lock(m);
+
+ if (bytes > cur_size)
+ cv.wait(lock);
+
+ if (bytes > (ULONG)buf->size) {
+ bytes = (ULONG)buf->size;
+ hr = S_FALSE;
+ }
+ if (bytes)
+ circlebuf_pop_front(buf, data, bytes);
+ if (read_bytes)
+ *read_bytes = bytes;
+
+ wait_size = 0;
+ pos.QuadPart += bytes;
+ return hr;
+}
+
+STDMETHODIMP CaptionStream::Write(const void *, ULONG bytes,
+ ULONG*)
+{
+ debugfunc("data, %lu, written_bytes", bytes);
+ UNUSED_PARAMETER(bytes);
+
+ return STG_E_INVALIDFUNCTION;
+}
+
+// IStream methods
+
+STDMETHODIMP CaptionStream::Seek(LARGE_INTEGER move, DWORD origin,
+ ULARGE_INTEGER *new_pos)
+{
+ debugfunc("%lld, %lx, new_pos", move, origin);
+ UNUSED_PARAMETER(move);
+ UNUSED_PARAMETER(origin);
+
+ if (!new_pos)
+ return E_POINTER;
+
+ if (origin != SEEK_CUR || move.QuadPart != 0)
+ return E_NOTIMPL;
+
+ *new_pos = pos;
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::SetSize(ULARGE_INTEGER new_size)
+{
+ debugfunc("%llu", new_size);
+ UNUSED_PARAMETER(new_size);
+ return STG_E_INVALIDFUNCTION;
+}
+
+STDMETHODIMP CaptionStream::CopyTo(IStream *stream, ULARGE_INTEGER bytes,
+ ULARGE_INTEGER *read_bytes,
+ ULARGE_INTEGER *written_bytes)
+{
+ HRESULT hr;
+
+ debugfunc("stream, %llu, read_bytes, written_bytes", bytes);
+
+ if (!stream)
+ return STG_E_INVALIDPOINTER;
+
+ ULONG written = 0;
+ if (bytes.QuadPart > (ULONGLONG)buf->size)
+ bytes.QuadPart = (ULONGLONG)buf->size;
+
+ lock_guard<mutex> lock(m);
+ temp_buf.resize((size_t)bytes.QuadPart);
+ circlebuf_peek_front(buf, &temp_buf[0], (size_t)bytes.QuadPart);
+
+ hr = stream->Write(temp_buf.data(), (ULONG)bytes.QuadPart, &written);
+
+ if (read_bytes)
+ *read_bytes = bytes;
+ if (written_bytes)
+ written_bytes->QuadPart = written;
+
+ return hr;
+}
+
+STDMETHODIMP CaptionStream::Commit(DWORD commit_flags)
+{
+ debugfunc("%lx", commit_flags);
+ UNUSED_PARAMETER(commit_flags);
+ /* TODO? */
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::Revert(void)
+{
+ debugfunc("");
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::LockRegion(ULARGE_INTEGER offset,
+ ULARGE_INTEGER size, DWORD type)
+{
+ debugfunc("%llu, %llu, %ld", offset, size, type);
+ UNUSED_PARAMETER(offset);
+ UNUSED_PARAMETER(size);
+ UNUSED_PARAMETER(type);
+ /* TODO? */
+ return STG_E_INVALIDFUNCTION;
+}
+
+STDMETHODIMP CaptionStream::UnlockRegion(ULARGE_INTEGER offset,
+ ULARGE_INTEGER size, DWORD type)
+{
+ debugfunc("%llu, %llu, %ld", offset, size, type);
+ UNUSED_PARAMETER(offset);
+ UNUSED_PARAMETER(size);
+ UNUSED_PARAMETER(type);
+ /* TODO? */
+ return STG_E_INVALIDFUNCTION;
+}
+
+static const wchar_t *stat_name = L"Caption stream";
+
+STDMETHODIMP CaptionStream::Stat(STATSTG *stg, DWORD flag)
+{
+ debugfunc("stg, %lu", flag);
+
+ if (!stg)
+ return E_POINTER;
+
+ lock_guard<mutex> lock(m);
+ *stg = {};
+ stg->type = STGTY_STREAM;
+ stg->cbSize.QuadPart = (ULONGLONG)buf->size;
+
+ if (flag == STATFLAG_DEFAULT) {
+ stg->pwcsName = (wchar_t*)CoTaskMemAlloc(sizeof(stat_name));
+ memcpy(stg->pwcsName, stat_name, sizeof(stat_name));
+ }
+
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::Clone(IStream **stream)
+{
+ debugfunc("stream");
+ *stream = nullptr;
+ return E_NOTIMPL;
+}
+
+// ISpStreamFormat methods
+
+STDMETHODIMP CaptionStream::GetFormat(GUID *guid,
+ WAVEFORMATEX **co_mem_wfex_out)
+{
+ debugfunc("guid, co_mem_wfex_out");
+
+ if (!guid || !co_mem_wfex_out)
+ return E_POINTER;
+
+ if (format.wFormatTag == 0) {
+ *co_mem_wfex_out = nullptr;
+ return S_OK;
+ }
+
+ void *wfex = CoTaskMemAlloc(sizeof(format));
+ memcpy(wfex, &format, sizeof(format));
+
+ *co_mem_wfex_out = (WAVEFORMATEX*)wfex;
+ return S_OK;
+}
+
+// ISpAudio methods
+
+STDMETHODIMP CaptionStream::SetState(SPAUDIOSTATE state_, ULONGLONG)
+{
+ debugfunc("%lu, reserved", state_);
+ state = state_;
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::SetFormat(REFGUID guid_ref,
+ const WAVEFORMATEX *wfex)
+{
+ debugfunc("guid, wfex");
+ if (!wfex)
+ return E_INVALIDARG;
+
+ if (guid_ref == SPDFID_WaveFormatEx) {
+ lock_guard<mutex> lock(m);
+ memcpy(&format, wfex, sizeof(format));
+ if (!handler->reset_resampler(AUDIO_FORMAT_16BIT,
+ wfex->nSamplesPerSec))
+ return E_FAIL;
+
+ /* 50 msec */
+ DWORD size = format.nSamplesPerSec / 20;
+ DWORD byte_size = size * format.nBlockAlign;
+ circlebuf_reserve(buf, (size_t)byte_size);
+ }
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::GetStatus(SPAUDIOSTATUS *status)
+{
+ debugfunc("status");
+
+ if (!status)
+ return E_POINTER;
+
+ /* TODO? */
+ lock_guard<mutex> lock(m);
+ *status = {};
+ status->cbNonBlockingIO = (ULONG)buf->size;
+ status->State = state;
+ status->CurSeekPos = pos.QuadPart;
+ status->CurDevicePos = write_pos;
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::SetBufferInfo(const SPAUDIOBUFFERINFO *buf_info_)
+{
+ debugfunc("buf_info");
+
+ /* TODO */
+ buf_info = *buf_info_;
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::GetBufferInfo(SPAUDIOBUFFERINFO *buf_info_)
+{
+ debugfunc("buf_info");
+ if (!buf_info_)
+ return E_POINTER;
+
+ *buf_info_ = buf_info;
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::GetDefaultFormat(GUID *format,
+ WAVEFORMATEX **co_mem_wfex_out)
+{
+ debugfunc("format, co_mem_wfex_out");
+
+ if (!format || !co_mem_wfex_out)
+ return E_POINTER;
+
+ void *wfex = CoTaskMemAlloc(sizeof(format));
+ memcpy(wfex, &format, sizeof(format));
+
+ *format = SPDFID_WaveFormatEx;
+ *co_mem_wfex_out = (WAVEFORMATEX*)wfex;
+ return S_OK;
+}
+
+STDMETHODIMP_(HANDLE) CaptionStream::EventHandle(void)
+{
+ debugfunc("");
+ return event;
+}
+
+STDMETHODIMP CaptionStream::GetVolumeLevel(ULONG *level)
+{
+ debugfunc("level");
+ if (!level)
+ return E_POINTER;
+
+ *level = vol;
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::SetVolumeLevel(ULONG level)
+{
+ debugfunc("%lu", level);
+ vol = level;
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::GetBufferNotifySize(ULONG *size)
+{
+ debugfunc("size");
+ if (!size)
+ return E_POINTER;
+ *size = notify_size;
+ return S_OK;
+}
+
+STDMETHODIMP CaptionStream::SetBufferNotifySize(ULONG size)
+{
+ debugfunc("%lu", size);
+ notify_size = size;
+ return S_OK;
+}
obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/captions-mssapi-stream.hpp
Added
+#pragma once
+
+#include <windows.h>
+#include <sapi.h>
+#include <condition_variable>
+#include <mutex>
+#include <vector>
+#include <obs.h>
+#include <util/circlebuf.h>
+#include <util/windows/WinHandle.hpp>
+
+#include <fstream>
+
+class CircleBuf {
+ circlebuf buf = {};
+public:
+ inline ~CircleBuf() {circlebuf_free(&buf);}
+ inline operator circlebuf*() {return &buf;}
+ inline circlebuf *operator->() {return &buf;}
+};
+
+class mssapi_captions;
+
+class CaptionStream : public ISpAudio {
+ volatile long refs = 1;
+ SPAUDIOBUFFERINFO buf_info = {};
+ mssapi_captions *handler;
+ ULONG notify_size = 0;
+ SPAUDIOSTATE state;
+ WinHandle event;
+ ULONG vol = 0;
+
+ std::condition_variable cv;
+ std::mutex m;
+ std::vector<int16_t> temp_buf;
+ WAVEFORMATEX format = {};
+
+ CircleBuf buf;
+ ULONG wait_size = 0;
+ DWORD samplerate = 0;
+ ULARGE_INTEGER pos = {};
+ ULONGLONG write_pos = 0;
+
+public:
+ CaptionStream(DWORD samplerate, mssapi_captions *handler_);
+
+ void Stop();
+ void PushAudio(const void *data, size_t frames);
+
+ // IUnknown methods
+ STDMETHODIMP QueryInterface(REFIID riid, void **ppv) override;
+ STDMETHODIMP_(ULONG) AddRef() override;
+ STDMETHODIMP_(ULONG) Release() override;
+
+ // ISequentialStream methods
+ STDMETHODIMP Read(void *data, ULONG bytes, ULONG *read_bytes) override;
+ STDMETHODIMP Write(const void *data, ULONG bytes, ULONG *written_bytes)
+ override;
+
+ // IStream methods
+ STDMETHODIMP Seek(LARGE_INTEGER move, DWORD origin,
+ ULARGE_INTEGER *new_pos) override;
+ STDMETHODIMP SetSize(ULARGE_INTEGER new_size) override;
+ STDMETHODIMP CopyTo(IStream *stream, ULARGE_INTEGER bytes,
+ ULARGE_INTEGER *read_bytes,
+ ULARGE_INTEGER *written_bytes) override;
+ STDMETHODIMP Commit(DWORD commit_flags) override;
+ STDMETHODIMP Revert(void) override;
+ STDMETHODIMP LockRegion(ULARGE_INTEGER offset, ULARGE_INTEGER size,
+ DWORD type) override;
+ STDMETHODIMP UnlockRegion(ULARGE_INTEGER offset, ULARGE_INTEGER size,
+ DWORD type) override;
+ STDMETHODIMP Stat(STATSTG *stg, DWORD flags) override;
+ STDMETHODIMP Clone(IStream **stream) override;
+
+ // ISpStreamFormat methods
+ STDMETHODIMP GetFormat(GUID *guid, WAVEFORMATEX **co_mem_wfex_out)
+ override;
+
+ // ISpAudio methods
+ STDMETHODIMP SetState(SPAUDIOSTATE state, ULONGLONG reserved) override;
+ STDMETHODIMP SetFormat(REFGUID guid_ref, const WAVEFORMATEX *wfex)
+ override;
+ STDMETHODIMP GetStatus(SPAUDIOSTATUS *status) override;
+ STDMETHODIMP SetBufferInfo(const SPAUDIOBUFFERINFO *buf_info) override;
+ STDMETHODIMP GetBufferInfo(SPAUDIOBUFFERINFO *buf_info) override;
+ STDMETHODIMP GetDefaultFormat(GUID *format,
+ WAVEFORMATEX **co_mem_wfex_out) override;
+ STDMETHODIMP_(HANDLE) EventHandle(void) override;
+ STDMETHODIMP GetVolumeLevel(ULONG *level) override;
+ STDMETHODIMP SetVolumeLevel(ULONG level) override;
+ STDMETHODIMP GetBufferNotifySize(ULONG *size) override;
+ STDMETHODIMP SetBufferNotifySize(ULONG size) override;
+};
obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/captions-mssapi.cpp
Added
+#include "captions-mssapi.hpp"
+
+#define do_log(type, format, ...) blog(type, "[Captions] " format, \
+ ##__VA_ARGS__)
+
+#define error(format, ...) do_log(LOG_ERROR, format, ##__VA_ARGS__)
+#define debug(format, ...) do_log(LOG_DEBUG, format, ##__VA_ARGS__)
+
+mssapi_captions::mssapi_captions(
+ captions_cb callback,
+ const std::string &lang) try
+ : captions_handler(callback, AUDIO_FORMAT_16BIT, 16000)
+{
+ HRESULT hr;
+
+ std::wstring wlang;
+ wlang.resize(lang.size());
+
+ for (size_t i = 0; i < lang.size(); i++)
+ wlang[i] = (wchar_t)lang[i];
+
+ LCID lang_id = LocaleNameToLCID(wlang.c_str(), 0);
+
+ wchar_t lang_str[32];
+ _snwprintf(lang_str, 31, L"language=%x", (int)lang_id);
+
+ stop = CreateEvent(nullptr, false, false, nullptr);
+ if (!stop.Valid())
+ throw "Failed to create event";
+
+ hr = SpFindBestToken(SPCAT_RECOGNIZERS, lang_str, nullptr, &token);
+ if (FAILED(hr))
+ throw HRError("SpFindBestToken failed", hr);
+
+ hr = CoCreateInstance(CLSID_SpInprocRecognizer, nullptr, CLSCTX_ALL,
+ __uuidof(ISpRecognizer), (void**)&recognizer);
+ if (FAILED(hr))
+ throw HRError("CoCreateInstance for recognizer failed", hr);
+
+ hr = recognizer->SetRecognizer(token);
+ if (FAILED(hr))
+ throw HRError("SetRecognizer failed", hr);
+
+ hr = recognizer->SetRecoState(SPRST_INACTIVE);
+ if (FAILED(hr))
+ throw HRError("SetRecoState(SPRST_INACTIVE) failed", hr);
+
+ hr = recognizer->CreateRecoContext(&context);
+ if (FAILED(hr))
+ throw HRError("CreateRecoContext failed", hr);
+
+ ULONGLONG interest = SPFEI(SPEI_RECOGNITION) |
+ SPFEI(SPEI_END_SR_STREAM);
+ hr = context->SetInterest(interest, interest);
+ if (FAILED(hr))
+ throw HRError("SetInterest failed", hr);
+
+ hr = context->SetNotifyWin32Event();
+ if (FAILED(hr))
+ throw HRError("SetNotifyWin32Event", hr);
+
+ notify = context->GetNotifyEventHandle();
+ if (notify == INVALID_HANDLE_VALUE)
+ throw HRError("GetNotifyEventHandle failed", E_NOINTERFACE);
+
+ size_t sample_rate = audio_output_get_sample_rate(obs_get_audio());
+ audio = new CaptionStream((DWORD)sample_rate, this);
+ audio->Release();
+
+ hr = recognizer->SetInput(audio, false);
+ if (FAILED(hr))
+ throw HRError("SetInput failed", hr);
+
+ hr = context->CreateGrammar(1, &grammar);
+ if (FAILED(hr))
+ throw HRError("CreateGrammar failed", hr);
+
+ hr = grammar->LoadDictation(nullptr, SPLO_STATIC);
+ if (FAILED(hr))
+ throw HRError("LoadDictation failed", hr);
+
+ try {
+ t = std::thread([this] () {main_thread();});
+ } catch (...) {
+ throw "Failed to create thread";
+ }
+
+} catch (const char *err) {
+ blog(LOG_WARNING, "%s: %s", __FUNCTION__, err);
+ throw CAPTIONS_ERROR_GENERIC_FAIL;
+
+} catch (HRError err) {
+ blog(LOG_WARNING, "%s: %s (%lX)", __FUNCTION__, err.str, err.hr);
+ throw CAPTIONS_ERROR_GENERIC_FAIL;
+}
+
+mssapi_captions::~mssapi_captions()
+{
+ if (t.joinable()) {
+ SetEvent(stop);
+ t.join();
+ }
+}
+
+void mssapi_captions::main_thread()
+try {
+ HRESULT hr;
+
+ os_set_thread_name(__FUNCTION__);
+
+ hr = grammar->SetDictationState(SPRS_ACTIVE);
+ if (FAILED(hr))
+ throw HRError("SetDictationState failed", hr);
+
+ hr = recognizer->SetRecoState(SPRST_ACTIVE);
+ if (FAILED(hr))
+ throw HRError("SetRecoState(SPRST_ACTIVE) failed", hr);
+
+ HANDLE events[] = {notify, stop};
+
+ started = true;
+
+ for (;;) {
+ DWORD ret = WaitForMultipleObjects(2, events, false, INFINITE);
+ if (ret != WAIT_OBJECT_0)
+ break;
+
+ CSpEvent event;
+ bool exit = false;
+
+ while (event.GetFrom(context) == S_OK) {
+ if (event.eEventId == SPEI_RECOGNITION) {
+ ISpRecoResult *result = event.RecoResult();
+
+ CoTaskMemPtr<wchar_t> text;
+ hr = result->GetText((ULONG)-1, (ULONG)-1,
+ true, &text, nullptr);
+ if (FAILED(hr))
+ continue;
+
+ char text_utf8[512];
+ os_wcs_to_utf8(text, 0, text_utf8, 512);
+
+ callback(text_utf8);
+
+ blog(LOG_DEBUG, "\"%s\"", text_utf8);
+
+ } else if (event.eEventId == SPEI_END_SR_STREAM) {
+ exit = true;
+ break;
+ }
+ }
+
+ if (exit)
+ break;
+ }
+
+ audio->Stop();
+
+} catch (HRError err) {
+ blog(LOG_WARNING, "%s failed: %s (%lX)", __FUNCTION__, err.str, err.hr);
+}
+
+void mssapi_captions::pcm_data(const void *data, size_t frames)
+{
+ if (started)
+ audio->PushAudio(data, frames);
+}
+
+captions_handler_info mssapi_info = {
+ [] () -> std::string
+ {
+ return "Microsoft Speech-to-Text";
+ },
+ [] (captions_cb cb, const std::string &lang) -> captions_handler *
+ {
+ return new mssapi_captions(cb, lang);
+ }
+};
obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/captions-mssapi.hpp
Added
+#pragma once
+
+#include "captions-handler.hpp"
+#include "captions-mssapi-stream.hpp"
+#include <util/windows/HRError.hpp>
+#include <util/windows/ComPtr.hpp>
+#include <util/windows/WinHandle.hpp>
+#include <util/windows/CoTaskMemPtr.hpp>
+#include <util/threading.h>
+#include <util/platform.h>
+#include <sphelper.h>
+
+#include <obs.hpp>
+
+#include <thread>
+
+class mssapi_captions : public captions_handler {
+ friend class CaptionStream;
+
+ ComPtr<CaptionStream> audio;
+ ComPtr<ISpObjectToken> token;
+ ComPtr<ISpRecoGrammar> grammar;
+ ComPtr<ISpRecognizer> recognizer;
+ ComPtr<ISpRecoContext> context;
+
+ HANDLE notify;
+ WinHandle stop;
+ std::thread t;
+ bool started = false;
+
+ void main_thread();
+
+public:
+ mssapi_captions(captions_cb callback, const std::string &lang);
+ virtual ~mssapi_captions();
+ virtual void pcm_data(const void *data, size_t frames) override;
+};
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/captions.cpp -> obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/captions.cpp
Changed
+#include <QMessageBox>
+
+#include <windows.h>
#include <obs-frontend-api.h>
-#include "captions-stream.hpp"
#include "captions.hpp"
+#include "captions-handler.hpp"
#include "tool-helpers.hpp"
-#include <sphelper.h>
#include <util/dstr.hpp>
#include <util/platform.h>
-#include <util/windows/HRError.hpp>
+#include <util/windows/WinHandle.hpp>
#include <util/windows/ComPtr.hpp>
-#include <util/windows/CoTaskMemPtr.hpp>
-#include <util/threading.h>
#include <obs-module.h>
+#include <sphelper.h>
+#include <unordered_map>
+#include <vector>
#include <string>
#include <thread>
#include <mutex>
+#include "captions-mssapi.hpp"
+
#define do_log(type, format, ...) blog(type, "[Captions] " format, \
##__VA_ARGS__)
-#define error(format, ...) do_log(LOG_ERROR, format, ##__VA_ARGS__)
+#define warn(format, ...) do_log(LOG_WARNING, format, ##__VA_ARGS__)
#define debug(format, ...) do_log(LOG_DEBUG, format, ##__VA_ARGS__)
using namespace std;
-struct obs_captions {
- thread th;
- recursive_mutex m;
- WinHandle stop_event;
+#define DEFAULT_HANDLER "mssapi"
+struct obs_captions {
+ string handler_id = DEFAULT_HANDLER;
string source_name;
OBSWeakSource source;
- LANGID lang_id;
+ unique_ptr<captions_handler> handler;
+ LANGID lang_id = GetUserDefaultUILanguage();
- void main_thread();
- void start();
- void stop();
+ std::unordered_map<std::string, captions_handler_info&> handler_types;
- inline obs_captions() :
- stop_event(CreateEvent(nullptr, false, false, nullptr)),
- lang_id(GetUserDefaultUILanguage())
+ inline void register_handler(const char *id,
+ captions_handler_info &info)
{
+ handler_types.emplace(id, info);
}
+ void start();
+ void stop();
+
+ obs_captions();
inline ~obs_captions() {stop();}
};
{
ui->setupUi(this);
- lock_guard<recursive_mutex> lock(captions->m);
-
auto cb = [this] (obs_source_t *source)
{
uint32_t caps = obs_source_get_output_flags(source);
return (*static_cast<cb_t*>(data))(source);}, &cb);
ui->source->blockSignals(false);
+ for (auto &ht : captions->handler_types) {
+ QString name = ht.second.name().c_str();
+ QString id = ht.first.c_str();
+ ui->provider->addItem(name, id);
+ }
+
+ QString qhandler_id = captions->handler_id.c_str();
+ int idx = ui->provider->findData(qhandler_id);
+ if (idx != -1)
+ ui->provider->setCurrentIndex(idx);
+
ui->enable->blockSignals(true);
- ui->enable->setChecked(captions->th.joinable());
+ ui->enable->setChecked(!!captions->handler);
ui->enable->blockSignals(false);
vector<locale_info> locales;
ui->language->setEnabled(false);
} else if (!set_language) {
- bool started = captions->th.joinable();
+ bool started = !!captions->handler;
if (started)
captions->stop();
- captions->m.lock();
captions->lang_id = locales[0].id;
- captions->m.unlock();
if (started)
captions->start();
void CaptionsDialog::on_source_currentIndexChanged(int)
{
- bool started = captions->th.joinable();
+ bool started = !!captions->handler;
if (started)
captions->stop();
- captions->m.lock();
captions->source_name = ui->source->currentText().toUtf8().constData();
captions->source = GetWeakSourceByName(captions->source_name.c_str());
- captions->m.unlock();
if (started)
captions->start();
void CaptionsDialog::on_enable_clicked(bool checked)
{
- if (checked)
+ if (checked) {
captions->start();
- else
+ if (!captions->handler) {
+ ui->enable->blockSignals(true);
+ ui->enable->setChecked(false);
+ ui->enable->blockSignals(false);
+ }
+ } else {
captions->stop();
+ }
}
void CaptionsDialog::on_language_currentIndexChanged(int)
{
- bool started = captions->th.joinable();
+ bool started = !!captions->handler;
if (started)
captions->stop();
- captions->m.lock();
captions->lang_id = (LANGID)ui->language->currentData().toInt();
- captions->m.unlock();
if (started)
captions->start();
}
-/* ------------------------------------------------------------------------- */
-
-void obs_captions::main_thread()
-try {
- ComPtr<CaptionStream> audio;
- ComPtr<ISpObjectToken> token;
- ComPtr<ISpRecoGrammar> grammar;
- ComPtr<ISpRecognizer> recognizer;
- ComPtr<ISpRecoContext> context;
- HRESULT hr;
-
- auto cb = [&] (const struct audio_data *audio_data,
- bool muted)
- {
- audio->PushAudio(audio_data, muted);
- };
-
- using cb_t = decltype(cb);
-
- auto pre_cb = [] (void *param, obs_source_t*,
- const struct audio_data *audio_data, bool muted)
- {
- return (*static_cast<cb_t*>(param))(audio_data, muted);
- };
-
- os_set_thread_name(__FUNCTION__);
-
- CoInitialize(nullptr);
-
- wchar_t lang_str[32];
- _snwprintf(lang_str, 31, L"language=%x", (int)captions->lang_id);
-
- hr = SpFindBestToken(SPCAT_RECOGNIZERS, lang_str, nullptr, &token);
- if (FAILED(hr))
- throw HRError("SpFindBestToken failed", hr);
-
- hr = CoCreateInstance(CLSID_SpInprocRecognizer, nullptr, CLSCTX_ALL,
- __uuidof(ISpRecognizer), (void**)&recognizer);
- if (FAILED(hr))
- throw HRError("CoCreateInstance for recognizer failed", hr);
-
- hr = recognizer->SetRecognizer(token);
- if (FAILED(hr))
- throw HRError("SetRecognizer failed", hr);
-
- hr = recognizer->SetRecoState(SPRST_INACTIVE);
- if (FAILED(hr))
- throw HRError("SetRecoState(SPRST_INACTIVE) failed", hr);
-
- hr = recognizer->CreateRecoContext(&context);
- if (FAILED(hr))
- throw HRError("CreateRecoContext failed", hr);
-
- ULONGLONG interest = SPFEI(SPEI_RECOGNITION) |
- SPFEI(SPEI_END_SR_STREAM);
- hr = context->SetInterest(interest, interest);
- if (FAILED(hr))
- throw HRError("SetInterest failed", hr);
-
- HANDLE notify;
-
- hr = context->SetNotifyWin32Event();
- if (FAILED(hr))
- throw HRError("SetNotifyWin32Event", hr);
-
- notify = context->GetNotifyEventHandle();
- if (notify == INVALID_HANDLE_VALUE)
- throw HRError("GetNotifyEventHandle failed", E_NOINTERFACE);
-
- size_t sample_rate = audio_output_get_sample_rate(obs_get_audio());
- audio = new CaptionStream((DWORD)sample_rate);
- audio->Release();
-
- hr = recognizer->SetInput(audio, false);
- if (FAILED(hr))
- throw HRError("SetInput failed", hr);
-
- hr = context->CreateGrammar(1, &grammar);
- if (FAILED(hr))
- throw HRError("CreateGrammar failed", hr);
-
- hr = grammar->LoadDictation(nullptr, SPLO_STATIC);
- if (FAILED(hr))
- throw HRError("LoadDictation failed", hr);
+void CaptionsDialog::on_provider_currentIndexChanged(int idx)
+{
+ bool started = !!captions->handler;
+ if (started)
+ captions->stop();
- hr = grammar->SetDictationState(SPRS_ACTIVE);
- if (FAILED(hr))
- throw HRError("SetDictationState failed", hr);
+ captions->handler_id =
+ ui->provider->itemData(idx).toString().toUtf8().constData();
- hr = recognizer->SetRecoState(SPRST_ACTIVE);
- if (FAILED(hr))
- throw HRError("SetRecoState(SPRST_ACTIVE) failed", hr);
+ if (started)
+ captions->start();
+}
- HANDLE events[] = {notify, stop_event};
+/* ------------------------------------------------------------------------- */
- {
- captions->source = GetWeakSourceByName(
- captions->source_name.c_str());
- OBSSource strong = OBSGetStrongRef(source);
- if (strong)
- obs_source_add_audio_capture_callback(strong,
- pre_cb, &cb);
+static void caption_text(const std::string &text)
+{
+ obs_output *output = obs_frontend_get_streaming_output();
+ if (output) {
+ obs_output_output_caption_text1(output, text.c_str());
+ obs_output_release(output);
}
+}
- for (;;) {
- DWORD ret = WaitForMultipleObjects(2, events, false, INFINITE);
- if (ret != WAIT_OBJECT_0)
- break;
-
- CSpEvent event;
- bool exit = false;
-
- while (event.GetFrom(context) == S_OK) {
- if (event.eEventId == SPEI_RECOGNITION) {
- ISpRecoResult *result = event.RecoResult();
+static void audio_capture(void*, obs_source_t*,
+ const struct audio_data *audio, bool)
+{
+ captions->handler->push_audio(audio);
+}
- CoTaskMemPtr<wchar_t> text;
- hr = result->GetText((ULONG)-1, (ULONG)-1,
- true, &text, nullptr);
- if (FAILED(hr))
- continue;
+void obs_captions::start()
+{
+ if (!captions->handler && valid_lang(lang_id)) {
+ wchar_t wname[256];
+
+ auto pair = handler_types.find(handler_id);
+ if (pair == handler_types.end()) {
+ warn("Failed to find handler '%s'",
+ handler_id.c_str());
+ return;
+ }
- char text_utf8[512];
- os_wcs_to_utf8(text, 0, text_utf8, 512);
+ if (!LCIDToLocaleName(lang_id, wname, 256, 0)) {
+ warn("Failed to get locale name: %d",
+ (int)GetLastError());
+ return;
+ }
- obs_output_t *output =
- obs_frontend_get_streaming_output();
- if (output)
- obs_output_output_caption_text1(output,
- text_utf8);
+ size_t len = (size_t)wcslen(wname);
- debug("\"%s\"", text_utf8);
+ string lang_name;
+ lang_name.resize(len);
- obs_output_release(output);
+ for (size_t i = 0; i < len; i++)
+ lang_name[i] = (char)wname[i];
- } else if (event.eEventId == SPEI_END_SR_STREAM) {
- exit = true;
- break;
- }
+ OBSSource s = OBSGetStrongRef(source);
+ if (!s) {
+ warn("Source invalid");
+ return;
}
- if (exit)
- break;
- }
+ try {
+ captions_handler *h = pair->second.create(caption_text,
+ lang_name);
+ handler.reset(h);
- {
- OBSSource strong = OBSGetStrongRef(source);
- if (strong)
- obs_source_remove_audio_capture_callback(strong,
- pre_cb, &cb);
- }
-
- audio->Stop();
+ OBSSource s = OBSGetStrongRef(source);
+ obs_source_add_audio_capture_callback(s,
+ audio_capture, nullptr);
- CoUninitialize();
+ } catch (std::string text) {
+ QWidget *window =
+ (QWidget*)obs_frontend_get_main_window();
-} catch (HRError err) {
- error("%s failed: %s (%lX)", __FUNCTION__, err.str, err.hr);
- CoUninitialize();
- captions->th.detach();
-}
+ warn("Failed to create handler: %s", text.c_str());
-void obs_captions::start()
-{
- if (!captions->th.joinable()) {
- ResetEvent(captions->stop_event);
+ QMessageBox::warning(window,
+ obs_module_text("Captions.Error.GenericFail"),
+ text.c_str());
- if (valid_lang(captions->lang_id))
- captions->th = thread([] () {captions->main_thread();});
+ }
}
}
void obs_captions::stop()
{
- if (!captions->th.joinable())
- return;
-
- SetEvent(captions->stop_event);
- captions->th.join();
+ OBSSource s = OBSGetStrongRef(source);
+ if (s)
+ obs_source_remove_audio_capture_callback(s,
+ audio_capture, nullptr);
+ handler.reset();
}
static bool get_locale_name(LANGID id, char *out)
/* ------------------------------------------------------------------------- */
+extern captions_handler_info mssapi_info;
+
+obs_captions::obs_captions()
+{
+ register_handler("mssapi", mssapi_info);
+}
+
+/* ------------------------------------------------------------------------- */
+
extern "C" void FreeCaptions()
{
delete captions;
static void save_caption_data(obs_data_t *save_data, bool saving, void*)
{
if (saving) {
- lock_guard<recursive_mutex> lock(captions->m);
obs_data_t *obj = obs_data_create();
obs_data_set_string(obj, "source",
captions->source_name.c_str());
- obs_data_set_bool(obj, "enabled", captions->th.joinable());
+ obs_data_set_bool(obj, "enabled", !!captions->handler);
obs_data_set_int(obj, "lang_id", captions->lang_id);
+ obs_data_set_string(obj, "provider",
+ captions->handler_id.c_str());
obs_data_set_obj(save_data, "captions", obj);
obs_data_release(obj);
} else {
captions->stop();
- captions->m.lock();
-
obs_data_t *obj = obs_data_get_obj(save_data, "captions");
if (!obj)
obj = obs_data_create();
obs_data_set_default_int(obj, "lang_id",
GetUserDefaultUILanguage());
+ obs_data_set_default_string(obj, "provider", DEFAULT_HANDLER);
bool enabled = obs_data_get_bool(obj, "enabled");
captions->source_name = obs_data_get_string(obj, "source");
captions->lang_id = (int)obs_data_get_int(obj, "lang_id");
+ captions->handler_id = obs_data_get_string(obj, "provider");
captions->source = GetWeakSourceByName(
captions->source_name.c_str());
obs_data_release(obj);
- captions->m.unlock();
-
if (enabled)
captions->start();
}
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/captions.hpp -> obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/captions.hpp
Changed
void on_source_currentIndexChanged(int idx);
void on_enable_clicked(bool checked);
void on_language_currentIndexChanged(int idx);
+ void on_provider_currentIndexChanged(int idx);
};
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/data/locale/en-US.ini -> obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/data/locale/en-US.ini
Changed
Captions="Captions (Experimental)"
Captions.AudioSource="Audio source"
Captions.CurrentSystemLanguage="Current System Language (%1)"
+Captions.Provider="Provider"
+Captions.Error.GenericFail="Failed to start captions"
OutputTimer="Output Timer"
OutputTimer.Stream="Stop streaming after:"
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/forms/captions.ui -> obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/forms/captions.ui
Changed
<x>0</x>
<y>0</y>
<width>519</width>
- <height>140</height>
+ <height>152</height>
</rect>
</property>
<property name="windowTitle">
<item row="2" column="1">
<widget class="QComboBox" name="language"/>
</item>
+ <item row="3" column="1">
+ <widget class="QComboBox" name="provider">
+ <property name="insertPolicy">
+ <enum>QComboBox::InsertAlphabetically</enum>
+ </property>
+ </widget>
+ </item>
+ <item row="3" column="0">
+ <widget class="QLabel" name="label_3">
+ <property name="text">
+ <string>Captions.Provider</string>
+ </property>
+ </widget>
+ </item>
</layout>
</item>
<item>
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/forms/output-timer.ui -> obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/forms/output-timer.ui
Changed
</property>
</widget>
</item>
+ <item row="6" column="8">
+ <widget class="QDialogButtonBox" name="buttonBox">
+ <property name="standardButtons">
+ <set>QDialogButtonBox::Close</set>
+ </property>
+ </widget>
+ </item>
</layout>
</widget>
<resources />
obs-studio-18.0.1.tar.xz/UI/frontend-plugins/frontend-tools/output-timer.cpp -> obs-studio-18.0.2.tar.xz/UI/frontend-plugins/frontend-tools/output-timer.cpp
Changed
SLOT(StreamingTimerButton()));
QObject::connect(ui->outputTimerRecord, SIGNAL(clicked()), this,
SLOT(RecordingTimerButton()));
+ QObject::connect(ui->buttonBox->button(QDialogButtonBox::Close),
+ SIGNAL(clicked()), this, SLOT(hide()));
streamingTimer = new QTimer(this);
streamingTimerDisplay = new QTimer(this);
obs-studio-18.0.1.tar.xz/UI/obs-app.cpp -> obs-studio-18.0.2.tar.xz/UI/obs-app.cpp
Changed
bool opt_studio_mode = false;
bool opt_start_replaybuffer = false;
bool opt_minimize_tray = false;
+bool opt_allow_opengl = false;
+bool opt_always_on_top = false;
string opt_starting_collection;
string opt_starting_profile;
string opt_starting_scene;
return ret;
}
-#define MAX_CRASH_REPORT_SIZE (50 * 1024)
+#define MAX_CRASH_REPORT_SIZE (150 * 1024)
#ifdef _WIN32
} else if (arg_is(argv[i], "--verbose", nullptr)) {
log_verbose = true;
+ } else if (arg_is(argv[i], "--always-on-top", nullptr)) {
+ opt_always_on_top = true;
+
} else if (arg_is(argv[i], "--unfiltered_log", nullptr)) {
unfiltered_log = true;
} else if (arg_is(argv[i], "--studio-mode", nullptr)) {
opt_studio_mode = true;
+ } else if (arg_is(argv[i], "--allow-opengl", nullptr)) {
+ opt_allow_opengl = true;
+
} else if (arg_is(argv[i], "--help", "-h")) {
std::cout <<
"--help, -h: Get list of available commands.\n\n" <<
"--minimize-to-tray: Minimize to system tray.\n" <<
"--portable, -p: Use portable mode.\n\n" <<
"--verbose: Make log more verbose.\n" <<
+ "--always-on-top: Start in 'always on top' mode.\n\n" <<
"--unfiltered_log: Make log unfiltered.\n\n" <<
+ "--allow-opengl: Allow OpenGL on Windows.\n\n" <<
"--version, -V: Get current version.\n";
exit(0);
obs-studio-18.0.1.tar.xz/UI/obs-app.hpp -> obs-studio-18.0.2.tar.xz/UI/obs-app.hpp
Changed
extern bool opt_start_replaybuffer;
extern bool opt_minimize_tray;
extern bool opt_studio_mode;
+extern bool opt_allow_opengl;
+extern bool opt_always_on_top;
extern std::string opt_starting_scene;
obs-studio-18.0.1.tar.xz/UI/obs-frontend-api/obs-frontend-api.cpp -> obs-studio-18.0.2.tar.xz/UI/obs-frontend-api/obs-frontend-api.cpp
Changed
if (callbacks_valid())
c->obs_frontend_pop_ui_translation();
}
+
+void obs_frontend_set_streaming_service(obs_service_t *service)
+{
+ if (callbacks_valid())
+ c->obs_frontend_set_streaming_service(service);
+}
+
+obs_service_t* obs_frontend_get_streaming_service(void)
+{
+ return !!callbacks_valid()
+ ? c->obs_frontend_get_streaming_service()
+ : nullptr;
+}
+
+void obs_frontend_save_streaming_service(void)
+{
+ if (callbacks_valid())
+ c->obs_frontend_save_streaming_service();
+}
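The new `obs_frontend_*_streaming_service` exports above all forward through a callbacks table and guard every call with `callbacks_valid()`, so plugins can call them before the UI has registered itself. A minimal standalone sketch of that guard pattern — the `service`/`frontend_callbacks` types here are illustrative stand-ins, not the real OBS API:

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical service handle and callbacks table, mirroring the
// callbacks_valid() guard used in obs-frontend-api.cpp.
struct service { int id; };

struct frontend_callbacks {
    service *svc = nullptr;
    service *get_streaming_service() { return svc; }
    void set_streaming_service(service *s) { svc = s; }
};

static frontend_callbacks *c = nullptr; // set once the UI registers

static bool callbacks_valid() { return c != nullptr; }

// Exported wrappers stay safe to call before registration: setters
// become no-ops and getters return nullptr instead of crashing.
service *frontend_get_streaming_service()
{
    return callbacks_valid() ? c->get_streaming_service() : nullptr;
}

void frontend_set_streaming_service(service *s)
{
    if (callbacks_valid())
        c->set_streaming_service(s);
}
```

The same shape covers `save_streaming_service`: check validity, then delegate.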
obs-studio-18.0.1.tar.xz/UI/obs-frontend-api/obs-frontend-api.h -> obs-studio-18.0.2.tar.xz/UI/obs-frontend-api/obs-frontend-api.h
Changed
obs_frontend_translate_ui_cb translate);
EXPORT void obs_frontend_pop_ui_translation(void);
+EXPORT void obs_frontend_set_streaming_service(obs_service_t *service);
+EXPORT obs_service_t* obs_frontend_get_streaming_service(void);
+EXPORT void obs_frontend_save_streaming_service(void);
+
/* ------------------------------------------------------------------------- */
#ifdef __cplusplus
obs-studio-18.0.1.tar.xz/UI/obs-frontend-api/obs-frontend-internal.hpp -> obs-studio-18.0.2.tar.xz/UI/obs-frontend-api/obs-frontend-internal.hpp
Changed
obs_frontend_translate_ui_cb translate)=0;
virtual void obs_frontend_pop_ui_translation(void)=0;
+ virtual void obs_frontend_set_streaming_service(
+ obs_service_t *service)=0;
+ virtual obs_service_t *obs_frontend_get_streaming_service(void)=0;
+ virtual void obs_frontend_save_streaming_service()=0;
+
virtual void on_load(obs_data_t *settings)=0;
virtual void on_save(obs_data_t *settings)=0;
virtual void on_event(enum obs_frontend_event event)=0;
obs-studio-18.0.1.tar.xz/UI/properties-view.cpp -> obs-studio-18.0.2.tar.xz/UI/properties-view.cpp
Changed
QLineEdit *edit = new QLineEdit();
QPushButton *button = new QPushButton(QTStr("Browse"));
+ if (!obs_property_enabled(prop)) {
+ edit->setEnabled(false);
+ button->setEnabled(false);
+ }
+
edit->setText(QT_UTF8(val));
edit->setReadOnly(true);
edit->setToolTip(QT_UTF8(obs_property_long_description(prop)));
int val = (int)obs_data_get_int(settings, name);
QSpinBox *spin = new QSpinBox();
+ if (!obs_property_enabled(prop))
+ spin->setEnabled(false);
+
int minVal = obs_property_int_min(prop);
int maxVal = obs_property_int_max(prop);
int stepVal = obs_property_int_step(prop);
double val = obs_data_get_double(settings, name);
QDoubleSpinBox *spin = new QDoubleSpinBox();
+ if (!obs_property_enabled(prop))
+ spin->setEnabled(false);
+
double minVal = obs_property_float_min(prop);
double maxVal = obs_property_float_max(prop);
double stepVal = obs_property_float_step(prop);
QListWidget *list = new QListWidget();
size_t count = obs_data_array_count(array);
+ if (!obs_property_enabled(prop))
+ list->setEnabled(false);
+
list->setSortingEnabled(false);
list->setSelectionMode(QAbstractItemView::ExtendedSelection);
list->setToolTip(QT_UTF8(obs_property_long_description(prop)));
long long val = obs_data_get_int(settings, name);
QColor color = color_from_int(val);
+ if (!obs_property_enabled(prop)) {
+ button->setEnabled(false);
+ colorLabel->setEnabled(false);
+ }
+
button->setText(QTStr("Basic.PropertiesWindow.SelectColor"));
button->setToolTip(QT_UTF8(obs_property_long_description(prop)));
QLabel *fontLabel = new QLabel;
QFont font;
+ if (!obs_property_enabled(prop)) {
+ button->setEnabled(false);
+ fontLabel->setEnabled(false);
+ }
+
font = fontLabel->font();
MakeQFont(font_obj, font, true);
label->setAlignment(Qt::AlignRight | Qt::AlignVCenter);
}
+ if (label && !obs_property_enabled(property))
+ label->setEnabled(false);
+
if (!widget)
return;
obs-studio-18.0.1.tar.xz/UI/win-update/updater/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/UI/win-update/updater/CMakeLists.txt
Changed
return()
endif()
+if(DISABLE_UPDATE_MODULE)
+ return()
+endif()
+
if(NOT DEFINED STATIC_ZLIB_PATH OR "${STATIC_ZLIB_PATH}" STREQUAL "")
message(STATUS "STATIC_ZLIB_PATH not set, windows updater disabled")
return()
obs-studio-18.0.1.tar.xz/UI/win-update/updater/http.cpp -> obs-studio-18.0.2.tar.xz/UI/win-update/updater/http.cpp
Changed
static bool ReadZippedHTTPData(string &responseBuf, z_stream *strm,
string &zipBuf, const uint8_t *buffer, DWORD outSize)
{
- do {
- strm->avail_in = outSize;
- strm->next_in = buffer;
+ strm->avail_in = outSize;
+ strm->next_in = buffer;
+ do {
strm->avail_out = (uInt)zipBuf.size();
strm->next_out = (Bytef *)zipBuf.data();
string &zipBuf, const uint8_t *buffer, DWORD outSize,
int *responseCode)
{
- do {
- strm->avail_in = outSize;
- strm->next_in = buffer;
+ strm->avail_in = outSize;
+ strm->next_in = buffer;
+ do {
strm->avail_out = (uInt)zipBuf.size();
strm->next_out = (Bytef *)zipBuf.data();
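The "Fix incorrect inflate use" change above moves the `avail_in`/`next_in` assignment out of the `do` loop: with the assignment inside, every iteration re-fed the same input chunk to `inflate`, so any chunk larger than the output window was decompressed incorrectly. The corrected consume-once loop can be sketched standalone — here a mock `step()` stands in for zlib's `inflate`, but the cursor bookkeeping is the same:

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <string>

// Mock stream mirroring zlib's z_stream cursor bookkeeping.
struct stream {
    const char *next_in;
    size_t avail_in;
    char *next_out;
    size_t avail_out;
};

// Mock "inflate": copies as much input as the output window allows,
// advancing the in/out cursors exactly as zlib would.
static void step(stream *s)
{
    size_t n = std::min(s->avail_in, s->avail_out);
    std::copy(s->next_in, s->next_in + n, s->next_out);
    s->next_in += n;  s->avail_in -= n;
    s->next_out += n; s->avail_out -= n;
}

// Corrected pattern: set the input ONCE before the loop, then refresh
// only the output window each iteration until the chunk is consumed.
static std::string read_chunk(const std::string &chunk, size_t window)
{
    std::string out;
    std::string buf(window, '\0');
    stream s;
    s.avail_in = chunk.size(); // assigned before the loop, as in the fix
    s.next_in = chunk.data();
    do {
        s.avail_out = buf.size();
        s.next_out = &buf[0];
        step(&s);
        out.append(buf.data(), buf.size() - s.avail_out);
    } while (s.avail_in > 0);
    return out;
}
```

With the assignment inside the loop, the second iteration would have reset `next_in` to the start of the chunk and duplicated data; placed outside, each iteration resumes where the last one stopped.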
obs-studio-18.0.1.tar.xz/UI/win-update/updater/updater.cpp -> obs-studio-18.0.2.tar.xz/UI/win-update/updater/updater.cpp
Changed
return false;
}
+static inline bool is_64bit_windows(void)
+{
+#ifdef _WIN64
+ return true;
+#else
+ BOOL x86 = false;
+ bool success = !!IsWow64Process(GetCurrentProcess(), &x86);
+ return success && !!x86;
+#endif
+}
+
+static inline bool is_64bit_file(const char *file)
+{
+ if (!file)
+ return false;
+
+ return strstr(file, "64bit") != nullptr ||
+ strstr(file, "64.dll") != nullptr ||
+ strstr(file, "64.exe") != nullptr;
+}
+
#define UTF8ToWideBuf(wide, utf8) UTF8ToWide(wide, _countof(wide), utf8)
#define WideToUTF8Buf(utf8, wide) WideToUTF8(utf8, _countof(utf8), wide)
json_t *name = json_object_get(package, "name");
json_t *files = json_object_get(package, "files");
+ bool isWin64 = is_64bit_windows();
+
if (!json_is_array(files))
return true;
if (!json_is_string(name))
if (strlen(hashUTF8) != BLAKE2_HASH_LENGTH * 2)
continue;
+ if (!isWin64 && is_64bit_file(fileUTF8))
+ continue;
+
/* convert strings to wide */
wchar_t sourceURL[1024];
StringCbPrintf(manifestPath, sizeof(manifestPath),
L"%s\\updates\\manifest.json", lpAppDataPath);
- if (!GetTempPathW(_countof(tempPath), tempPath)) {
+ if (!GetTempPathW(_countof(tempDirName), tempDirName)) {
Status(L"Update failed: Failed to get temp path: %ld",
GetLastError());
return false;
}
- if (!GetTempFileNameW(tempDirName, L"obs-studio", 0, tempDirName)) {
+ if (!GetTempFileNameW(tempDirName, L"obs-studio", 0, tempPath)) {
Status(L"Update failed: Failed to create temp dir name: %ld",
GetLastError());
return false;
}
- StringCbCat(tempPath, sizeof(tempPath), tempDirName);
+ DeleteFile(tempPath);
CreateDirectory(tempPath, nullptr);
/* ------------------------------------- *
* Send file hashes */
string newManifest;
- {
+
+ if (json_array_size(files) > 0) {
char *post_body = json_dumps(files, JSON_COMPACT);
int responseCode;
responseCode);
return false;
}
+ } else {
+ newManifest = "[]";
}
/* ------------------------------------- *
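The "Ignore 64bit files on 32bit windows" change above pairs an OS check (`IsWow64Process`) with a filename heuristic; the heuristic part is portable and can be reproduced in isolation:

```cpp
#include <cassert>
#include <cstring>

// Filename heuristic matching the is_64bit_file() helper added in
// updater.cpp: a manifest entry is treated as 64-bit if its path
// contains a "64bit" component or ends in 64.dll / 64.exe.
static bool is_64bit_file(const char *file)
{
    if (!file)
        return false;
    return strstr(file, "64bit") != nullptr ||
           strstr(file, "64.dll") != nullptr ||
           strstr(file, "64.exe") != nullptr;
}
```

On a 32-bit Windows install the updater then simply `continue`s past any entry this returns true for, so 64-bit payloads are never downloaded or hashed.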
obs-studio-18.0.1.tar.xz/UI/window-basic-adv-audio.cpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-adv-audio.cpp
Changed
#include <QVBoxLayout>
+#include <QHBoxLayout>
#include <QGridLayout>
#include <QScrollArea>
+#include <QPushButton>
#include <QLabel>
#include "window-basic-adv-audio.hpp"
#include "window-basic-main.hpp"
scrollArea->setWidget(widget);
scrollArea->setWidgetResizable(true);
+ QPushButton *closeButton = new QPushButton(QTStr("Close"));
+
+ QHBoxLayout *buttonLayout = new QHBoxLayout;
+ buttonLayout->addStretch();
+ buttonLayout->addWidget(closeButton);
+
vlayout = new QVBoxLayout;
vlayout->setContentsMargins(11, 11, 11, 11);
vlayout->addWidget(scrollArea);
+ vlayout->addLayout(buttonLayout);
setLayout(vlayout);
+ connect(closeButton, &QPushButton::clicked, [this] () {close();});
+
installEventFilter(CreateShortcutFilter());
/* enum user scene/sources */
obs-studio-18.0.1.tar.xz/UI/window-basic-main-dropfiles.cpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-main-dropfiles.cpp
Changed
obs_data_t *settings = obs_data_create();
obs_source_t *source = nullptr;
const char *type = nullptr;
+ QString name;
switch (image) {
case DropType_RawText:
case DropType_Text:
obs_data_set_bool(settings, "read_from_file", true);
obs_data_set_string(settings, "file", data);
+ name = QUrl::fromLocalFile(QString(data)).fileName();
type = "text_gdiplus";
break;
case DropType_Image:
obs_data_set_string(settings, "file", data);
+ name = QUrl::fromLocalFile(QString(data)).fileName();
type = "image_source";
break;
case DropType_Media:
obs_data_set_string(settings, "local_file", data);
+ name = QUrl::fromLocalFile(QString(data)).fileName();
type = "ffmpeg_source";
break;
}
- const char *name = obs_source_get_display_name(type);
- source = obs_source_create(type, GenerateSourceName(name).c_str(),
+ if (name.isEmpty())
+ name = obs_source_get_display_name(type);
+ source = obs_source_create(type,
+ GenerateSourceName(QT_TO_UTF8(name)).c_str(),
settings, nullptr);
if (source) {
OBSScene scene = main->GetCurrentScene();
obs-studio-18.0.1.tar.xz/UI/window-basic-main-scene-collections.cpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-main-scene-collections.cpp
Changed
OBSBasic *main = reinterpret_cast<OBSBasic*>(App()->GetMainWindow());
main->OpenSavedProjectors();
+ main->ui->actionPasteFilters->setEnabled(false);
+ main->ui->actionPasteRef->setEnabled(false);
+ main->ui->actionPasteDup->setEnabled(false);
}
void OBSBasic::on_actionNewSceneCollection_triggered()
obs-studio-18.0.1.tar.xz/UI/window-basic-main.cpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-main.cpp
Changed
bool alwaysOnTop = config_get_bool(App()->GlobalConfig(), "BasicWindow",
"AlwaysOnTop");
- if (alwaysOnTop) {
+ if (alwaysOnTop || opt_always_on_top) {
SetAlwaysOnTop(this, true);
ui->actionAlwaysOnTop->setChecked(true);
}
QListWidgetItem *sel = nullptr;
int count = ui->scenes->count();
+
for (int i = 0; i < count; i++) {
auto item = ui->scenes->item(i);
auto cur_scene = GetOBSRef<OBSScene>(item);
bool OBSBasic::QueryRemoveSource(obs_source_t *source)
{
+ if (obs_source_get_type(source) == OBS_SOURCE_TYPE_SCENE) {
+ int count = ui->scenes->count();
+
+ if (count == 1) {
+ QMessageBox::information(this,
+ QTStr("FinalScene.Title"),
+ QTStr("FinalScene.Text"));
+ return false;
+ }
+ }
+
const char *name = obs_source_get_name(source);
QString text = QTStr("ConfirmRemove.Text");
#endif
}
-#ifdef __APPLE__
-#define VERSION_ENTRY "mac"
-#elif _WIN32
-#define VERSION_ENTRY "windows"
-#else
-#define VERSION_ENTRY "other"
-#endif
-
void OBSBasic::updateCheckFinished()
{
ui->actionCheckForUpdates->setEnabled(true);
{
if (event->type() == QEvent::WindowStateChange &&
isMinimized() &&
+ trayIcon &&
trayIcon->isVisible() &&
sysTrayMinimizeToTray()) {
if (addSourceMenu)
popup.addMenu(addSourceMenu);
+ ui->actionCopyFilters->setEnabled(false);
+
+ popup.addSeparator();
+ popup.addAction(ui->actionCopySource);
+ popup.addAction(ui->actionPasteRef);
+ popup.addAction(ui->actionPasteDup);
+ popup.addSeparator();
+
+ popup.addSeparator();
+ popup.addAction(ui->actionCopyFilters);
+ popup.addAction(ui->actionPasteFilters);
+ popup.addSeparator();
+
if (item) {
if (addSourceMenu)
popup.addSeparator();
SLOT(OpenFilters()));
popup.addAction(QTStr("Properties"), this,
SLOT(on_actionSourceProperties_triggered()));
+
+ ui->actionCopyFilters->setEnabled(true);
}
popup.exec(QCursor::pos());
return config_get_bool(GetGlobalConfig(),
"BasicWindow", "SysTrayMinimizeToTray");
}
+
+void OBSBasic::on_actionCopySource_triggered()
+{
+ on_actionCopyTransform_triggered();
+
+ OBSSceneItem item = GetCurrentSceneItem();
+
+ if (!item)
+ return;
+
+ OBSSource source = obs_sceneitem_get_source(item);
+
+ copyString = obs_source_get_name(source);
+ copyVisible = obs_sceneitem_visible(item);
+
+ ui->actionPasteRef->setEnabled(true);
+ ui->actionPasteDup->setEnabled(true);
+}
+
+void OBSBasic::on_actionPasteRef_triggered()
+{
+ OBSBasicSourceSelect::SourcePaste(copyString, copyVisible, false);
+ on_actionPasteTransform_triggered();
+}
+
+void OBSBasic::on_actionPasteDup_triggered()
+{
+ OBSBasicSourceSelect::SourcePaste(copyString, copyVisible, true);
+ on_actionPasteTransform_triggered();
+}
+
+void OBSBasic::on_actionCopyFilters_triggered()
+{
+ OBSSceneItem item = GetCurrentSceneItem();
+
+ if (!item)
+ return;
+
+ OBSSource source = obs_sceneitem_get_source(item);
+
+ copyFiltersString = obs_source_get_name(source);
+
+ ui->actionPasteFilters->setEnabled(true);
+}
+
+void OBSBasic::on_actionPasteFilters_triggered()
+{
+ OBSSource source = obs_get_source_by_name(copyFiltersString);
+ OBSSceneItem sceneItem = GetCurrentSceneItem();
+
+ OBSSource dstSource = obs_sceneitem_get_source(sceneItem);
+
+ if (source == dstSource)
+ return;
+
+ obs_source_copy_filters(dstSource, source);
+}
obs-studio-18.0.1.tar.xz/UI/window-basic-main.hpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-main.hpp
Changed
bool projectChanged = false;
bool previewEnabled = true;
+ const char *copyString;
+ const char *copyFiltersString;
+ bool copyVisible = true;
+
QPointer<QThread> updateCheckThread;
QPointer<QThread> logUploadThread;
void ToggleShowHide();
+ void on_actionCopySource_triggered();
+ void on_actionPasteRef_triggered();
+ void on_actionPasteDup_triggered();
+
+ void on_actionCopyFilters_triggered();
+ void on_actionPasteFilters_triggered();
+
private:
/* OBS Callbacks */
static void SceneReordered(void *data, calldata_t *params);
obs-studio-18.0.1.tar.xz/UI/window-basic-preview.hpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-preview.hpp
Changed
/* use libobs allocator for alignment because the matrices itemToScreen
* and screenToItem may contain SSE data, which will cause SSE
* instructions to crash if the data is not aligned to at least a 16
- * byte boundry. */
+ * byte boundary. */
static inline void* operator new(size_t size) {return bmalloc(size);}
static inline void operator delete(void* ptr) {bfree(ptr);}
};
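The comment fixed above sits on a class that overrides `operator new`/`delete` so its SSE matrix members land on 16-byte-aligned storage. A standalone sketch of the same technique, with C++17 `std::aligned_alloc` standing in for libobs's `bmalloc` (which guarantees the same alignment); note `aligned_alloc` requires the size to be a multiple of the alignment, hence the round-up:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <cstdlib>

// Class whose members need 16-byte alignment (e.g. SSE matrices).
// Default operator new only guarantees alignof(max_align_t), which
// may be less than 16 on some ABIs -- hence the override.
struct Preview {
    alignas(16) float itemToScreen[16];

    static void *operator new(size_t size)
    {
        return std::aligned_alloc(16, (size + 15) / 16 * 16);
    }
    static void operator delete(void *ptr) { std::free(ptr); }
};
```

Without the override, an unaligned allocation would make aligned SSE loads/stores on those matrices fault at runtime, exactly as the original comment warns.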
obs-studio-18.0.1.tar.xz/UI/window-basic-settings.cpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-settings.cpp
Changed
"Renderer");
ui->renderer->addItem(QT_UTF8("Direct3D 11"));
- ui->renderer->addItem(QT_UTF8("OpenGL"));
+ if (opt_allow_opengl || strcmp(renderer, "OpenGL") == 0)
+ ui->renderer->addItem(QT_UTF8("OpenGL"));
int idx = ui->renderer->findText(QT_UTF8(renderer));
if (idx == -1)
idx = 0;
- if (strcmp(renderer, "OpenGL") == 0) {
- delete ui->adapter;
- delete ui->adapterLabel;
- ui->adapter = nullptr;
- ui->adapterLabel = nullptr;
- }
+ // the video adapter selection is not currently implemented, hide for now
+ // to avoid user confusion. was previously protected by
+ // if (strcmp(renderer, "OpenGL") == 0)
+ delete ui->adapter;
+ delete ui->adapterLabel;
+ ui->adapter = nullptr;
+ ui->adapterLabel = nullptr;
ui->renderer->setCurrentIndex(idx);
#endif
obs-studio-18.0.1.tar.xz/UI/window-basic-source-select.cpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-source-select.cpp
Changed
obs_sceneitem_set_visible(sceneitem, data->visible);
}
-static void AddExisting(const char *name, const bool visible)
+static char *get_new_source_name(const char *name)
+{
+ struct dstr new_name = {0};
+ int inc = 0;
+
+ dstr_copy(&new_name, name);
+
+ for (;;) {
+ obs_source_t *existing_source = obs_get_source_by_name(
+ new_name.array);
+ if (!existing_source)
+ break;
+
+ obs_source_release(existing_source);
+
+ dstr_printf(&new_name, "%s %d", name, ++inc + 1);
+ }
+
+ return new_name.array;
+}
+
+static void AddExisting(const char *name, bool visible, bool duplicate)
{
OBSBasic *main = reinterpret_cast<OBSBasic*>(App()->GetMainWindow());
OBSScene scene = main->GetCurrentScene();
obs_source_t *source = obs_get_source_by_name(name);
if (source) {
+ if (duplicate) {
+ obs_source_t *from = source;
+ char *new_name = get_new_source_name(name);
+ source = obs_source_duplicate(from, new_name, false);
+ bfree(new_name);
+ obs_source_release(from);
+
+ if (!source)
+ return;
+ }
+
AddSourceData data;
data.source = source;
data.visible = visible;
if (!item)
return;
- AddExisting(QT_TO_UTF8(item->text()), visible);
+ AddExisting(QT_TO_UTF8(item->text()), visible, false);
} else {
if (ui->sourceName->text().isEmpty()) {
QMessageBox::information(this,
obs_enum_sources(EnumSources, this);
}
}
+
+void OBSBasicSourceSelect::SourcePaste(const char *name, bool visible, bool dup)
+{
+ AddExisting(name, visible, dup);
+}
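`get_new_source_name()` above probes the global source registry for collisions, appending " 2", " 3", … until a free name is found. A self-contained C++ equivalent, with a plain name set replacing the `dstr`/`obs_get_source_by_name` machinery:

```cpp
#include <cassert>
#include <set>
#include <string>

// Collision-avoiding rename mirroring get_new_source_name(): the
// first duplicate of "Image" becomes "Image 2", the next "Image 3",
// and so on (the counter starts at 2, matching ++inc + 1 above).
static std::string new_source_name(const std::set<std::string> &existing,
                                   const std::string &name)
{
    std::string candidate = name;
    int inc = 0;
    while (existing.count(candidate))
        candidate = name + " " + std::to_string(++inc + 1);
    return candidate;
}
```

This is what makes "paste (duplicate)" safe: `obs_source_duplicate` needs a name that is globally unique, and the loop guarantees one no matter how many copies already exist.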
obs-studio-18.0.1.tar.xz/UI/window-basic-source-select.hpp -> obs-studio-18.0.2.tar.xz/UI/window-basic-source-select.hpp
Changed
OBSBasicSourceSelect(OBSBasic *parent, const char *id);
OBSSource newSource;
+
+ static void SourcePaste(const char *name, bool visible, bool duplicate);
};
obs-studio-18.0.1.tar.xz/UI/window-remux.cpp -> obs-studio-18.0.2.tar.xz/UI/window-remux.cpp
Changed
ui->setupUi(this);
ui->progressBar->setVisible(false);
- ui->remux->setEnabled(false);
+ ui->buttonBox->button(QDialogButtonBox::Ok)->
+ setEnabled(false);
ui->targetFile->setEnabled(false);
ui->browseTarget->setEnabled(false);
[&]() { BrowseInput(); });
connect(ui->browseTarget, &QPushButton::clicked,
[&]() { BrowseOutput(); });
- connect(ui->remux, &QPushButton::clicked, [&]() { Remux(); });
connect(ui->sourceFile, &QLineEdit::textChanged,
this, &OBSRemux::inputChanged);
+ ui->buttonBox->button(QDialogButtonBox::Ok)->
+ setText(QTStr("Remux.Remux"));
+
+ connect(ui->buttonBox->button(QDialogButtonBox::Ok),
+ SIGNAL(clicked()), this, SLOT(Remux()));
+
+ connect(ui->buttonBox->button(QDialogButtonBox::Close),
+ SIGNAL(clicked()), this, SLOT(close()));
+
worker->moveToThread(&remuxer);
remuxer.start();
void OBSRemux::inputChanged(const QString &path)
{
if (!QFileInfo::exists(path)) {
- ui->remux->setEnabled(false);
+ ui->buttonBox->button(QDialogButtonBox::Ok)->
+ setEnabled(false);
return;
}
ui->sourceFile->setText(path);
- ui->remux->setEnabled(true);
+ ui->buttonBox->button(QDialogButtonBox::Ok)->
+ setEnabled(true);
QFileInfo fi(path);
QString mp4 = fi.path() + "/" + fi.baseName() + ".mp4";
worker->lastProgress = 0.f;
ui->progressBar->setVisible(true);
- ui->remux->setEnabled(false);
+ ui->buttonBox->button(QDialogButtonBox::Ok)->
+ setEnabled(false);
emit remux();
}
worker->job.reset();
ui->progressBar->setVisible(false);
- ui->remux->setEnabled(true);
+ ui->buttonBox->button(QDialogButtonBox::Ok)->
+ setEnabled(true);
}
RemuxWorker::RemuxWorker()
obs-studio-18.0.1.tar.xz/UI/window-remux.hpp -> obs-studio-18.0.2.tar.xz/UI/window-remux.hpp
Changed
void BrowseInput();
void BrowseOutput();
- void Remux();
bool Stop();
public slots:
void updateProgress(float percent);
void remuxFinished(bool success);
+ void Remux();
signals:
void remux();
obs-studio-18.0.1.tar.xz/deps/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/deps/CMakeLists.txt
Changed
add_subdirectory(glad)
add_subdirectory(ipc-util)
-add_subdirectory(libff)
+
+if(BUILD_LIBFF)
+ add_subdirectory(libff)
+endif()
+
+add_subdirectory(media-playback)
add_subdirectory(file-updater)
if(WIN32)
obs-studio-18.0.2.tar.xz/deps/jansson/test/suites/.gitattributes
Added
+api/ text=auto
+* text eol=lf
\ No newline at end of file
obs-studio-18.0.1.tar.xz/deps/libff/libff/ff-util.c -> obs-studio-18.0.2.tar.xz/deps/libff/libff/ff-util.c
Changed
void ff_init()
{
av_register_all();
- avdevice_register_all();
+ //avdevice_register_all();
avcodec_register_all();
avformat_network_init();
}
obs-studio-18.0.2.tar.xz/deps/media-playback
Added
+(directory)
obs-studio-18.0.2.tar.xz/deps/media-playback/CMakeLists.txt
Added
+project(media-playback)
+
+find_package(FFmpeg REQUIRED
+ COMPONENTS avcodec avdevice avutil avformat)
+
+include_directories(
+ ${CMAKE_SOURCE_DIR}/libobs
+ ${FFMPEG_INCLUDE_DIRS}
+ )
+
+set(media-playback_HEADERS
+ media-playback/decode.h
+ media-playback/media.h
+ )
+set(media-playback_SOURCES
+ media-playback/decode.c
+ media-playback/media.c
+ )
+
+add_library(media-playback STATIC
+ ${media-playback_HEADERS}
+ ${media-playback_SOURCES}
+ )
+target_include_directories(media-playback
+ PUBLIC .
+ )
+
+if(NOT MSVC)
+ if(NOT MINGW)
+ target_compile_options(media-playback PRIVATE -fPIC)
+ endif()
+endif()
+
+target_link_libraries(media-playback
+ ${FFMPEG_LIBRARIES}
+ )
obs-studio-18.0.2.tar.xz/deps/media-playback/media-playback
Added
+(directory)
obs-studio-18.0.2.tar.xz/deps/media-playback/media-playback/closest-format.h
Added
+/*
+ * Copyright (c) 2017 Hugh Bailey <obs.jim@gmail.com>
+ *
+ * Permission to use, copy, modify, and distribute this software for any
+ * purpose with or without fee is hereby granted, provided that the above
+ * copyright notice and this permission notice appear in all copies.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ * WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ * MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ * ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ * WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ * ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ * OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+ */
+
+#pragma once
+
+static enum AVPixelFormat closest_format(enum AVPixelFormat fmt)
+{
+ switch (fmt) {
+ case AV_PIX_FMT_YUYV422:
+ return AV_PIX_FMT_YUYV422;
+
+ case AV_PIX_FMT_YUV422P:
+ case AV_PIX_FMT_YUVJ422P:
+ case AV_PIX_FMT_UYVY422:
+ case AV_PIX_FMT_YUV422P16LE:
+ case AV_PIX_FMT_YUV422P16BE:
+ case AV_PIX_FMT_YUV422P10BE:
+ case AV_PIX_FMT_YUV422P10LE:
+ case AV_PIX_FMT_YUV422P9BE:
+ case AV_PIX_FMT_YUV422P9LE:
+ case AV_PIX_FMT_YVYU422:
+ case AV_PIX_FMT_YUV422P12BE:
+ case AV_PIX_FMT_YUV422P12LE:
+ case AV_PIX_FMT_YUV422P14BE:
+ case AV_PIX_FMT_YUV422P14LE:
+ return AV_PIX_FMT_UYVY422;
+
+ case AV_PIX_FMT_NV12:
+ case AV_PIX_FMT_NV21:
+ return AV_PIX_FMT_NV12;
+
+ case AV_PIX_FMT_YUV420P:
+ case AV_PIX_FMT_YUVJ420P:
+ case AV_PIX_FMT_YUV411P:
+ case AV_PIX_FMT_UYYVYY411:
+ case AV_PIX_FMT_YUV410P:
+ case AV_PIX_FMT_YUV420P16LE:
+ case AV_PIX_FMT_YUV420P16BE:
+ case AV_PIX_FMT_YUV420P9BE:
+ case AV_PIX_FMT_YUV420P9LE:
+ case AV_PIX_FMT_YUV420P10BE:
+ case AV_PIX_FMT_YUV420P10LE:
+ case AV_PIX_FMT_YUV420P12BE:
+ case AV_PIX_FMT_YUV420P12LE:
+ case AV_PIX_FMT_YUV420P14BE:
+ case AV_PIX_FMT_YUV420P14LE:
+ return AV_PIX_FMT_YUV420P;
+
+ case AV_PIX_FMT_RGBA:
+ case AV_PIX_FMT_BGRA:
+ case AV_PIX_FMT_BGR0:
+ return fmt;
+
+ default:
+ break;
+ }
+
+ return AV_PIX_FMT_BGRA;
+}
obs-studio-18.0.2.tar.xz/deps/media-playback/media-playback/decode.c
Added
+/*
+ * Copyright (c) 2017 Hugh Bailey <obs.jim@gmail.com>
+ *
+ * Permission to use, copy, modify, and distribute this software for any
+ * purpose with or without fee is hereby granted, provided that the above
+ * copyright notice and this permission notice appear in all copies.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ * WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ * MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ * ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ * WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ * ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ * OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+ */
+
+#include "decode.h"
+#include "media.h"
+
+static AVCodec *find_hardware_decoder(enum AVCodecID id)
+{
+ AVHWAccel *hwa = av_hwaccel_next(NULL);
+ AVCodec *c = NULL;
+
+ while (hwa) {
+ if (hwa->id == id) {
+ if (hwa->pix_fmt == AV_PIX_FMT_VDA_VLD ||
+ hwa->pix_fmt == AV_PIX_FMT_DXVA2_VLD ||
+ hwa->pix_fmt == AV_PIX_FMT_VAAPI_VLD) {
+ c = avcodec_find_decoder_by_name(hwa->name);
+ if (c)
+ break;
+ }
+ }
+
+ hwa = av_hwaccel_next(hwa);
+ }
+
+ return c;
+}
+
+static int mp_open_codec(struct mp_decode *d)
+{
+ AVCodecContext *c;
+ int ret;
+
+#if LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 40, 101)
+ c = avcodec_alloc_context3(d->codec);
+ if (!c) {
+ blog(LOG_WARNING, "MP: Failed to allocate context");
+ return -1;
+ }
+
+ ret = avcodec_parameters_to_context(c, d->stream->codecpar);
+ if (ret < 0)
+ goto fail;
+#else
+ c = d->stream->codec;
+#endif
+
+ if (c->thread_count == 1 &&
+ c->codec_id != AV_CODEC_ID_PNG &&
+ c->codec_id != AV_CODEC_ID_TIFF &&
+ c->codec_id != AV_CODEC_ID_JPEG2000 &&
+ c->codec_id != AV_CODEC_ID_MPEG4 &&
+ c->codec_id != AV_CODEC_ID_WEBP)
+ c->thread_count = 0;
+
+ ret = avcodec_open2(c, d->codec, NULL);
+ if (ret < 0)
+ goto fail;
+
+ d->decoder = c;
+ return ret;
+
+fail:
+ avcodec_close(c);
+#if LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 40, 101)
+ av_free(d->decoder);
+#endif
+ return ret;
+}
+
+bool mp_decode_init(mp_media_t *m, enum AVMediaType type, bool hw)
+{
+ struct mp_decode *d = type == AVMEDIA_TYPE_VIDEO ? &m->v : &m->a;
+ enum AVCodecID id;
+ AVStream *stream;
+ int ret;
+
+ memset(d, 0, sizeof(*d));
+ d->m = m;
+ d->audio = type == AVMEDIA_TYPE_AUDIO;
+
+ ret = av_find_best_stream(m->fmt, type, -1, -1, NULL, 0);
+ if (ret < 0)
+ return false;
+ stream = d->stream = m->fmt->streams[ret];
+
+#if LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 40, 101)
+ id = stream->codecpar->codec_id;
+#else
+ id = stream->codec->codec_id;
+#endif
+
+ if (hw) {
+ d->codec = find_hardware_decoder(id);
+ if (d->codec) {
+ ret = mp_open_codec(d);
+ if (ret < 0)
+ d->codec = NULL;
+ }
+ }
+
+ if (!d->codec) {
+ if (id == AV_CODEC_ID_VP8)
+ d->codec = avcodec_find_decoder_by_name("libvpx");
+ else if (id == AV_CODEC_ID_VP9)
+ d->codec = avcodec_find_decoder_by_name("libvpx-vp9");
+
+ if (!d->codec)
+ d->codec = avcodec_find_decoder(id);
+ if (!d->codec) {
+ blog(LOG_WARNING, "MP: Failed to find %s codec",
+ av_get_media_type_string(type));
+ return false;
+ }
+
+ ret = mp_open_codec(d);
+ if (ret < 0) {
+ blog(LOG_WARNING, "MP: Failed to open %s decoder: %s",
+ av_get_media_type_string(type),
+ av_err2str(ret));
+ return false;
+ }
+ }
+
+ d->frame = av_frame_alloc();
+ if (!d->frame) {
+ blog(LOG_WARNING, "MP: Failed to allocate %s frame",
+ av_get_media_type_string(type));
+ return false;
+ }
+
+ if (d->codec->capabilities & CODEC_CAP_TRUNCATED)
+ d->decoder->flags |= CODEC_FLAG_TRUNCATED;
+ return true;
+}
+
+void mp_decode_clear_packets(struct mp_decode *d)
+{
+ if (d->packet_pending) {
+ av_packet_unref(&d->orig_pkt);
+ d->packet_pending = false;
+ }
+
+ while (d->packets.size) {
+ AVPacket pkt;
+ circlebuf_pop_front(&d->packets, &pkt, sizeof(pkt));
+ av_packet_unref(&pkt);
+ }
+}
+
+void mp_decode_free(struct mp_decode *d)
+{
+ mp_decode_clear_packets(d);
+ circlebuf_free(&d->packets);
+
+ if (d->decoder) {
+ avcodec_close(d->decoder);
+#if LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 40, 101)
+ av_free(d->decoder);
+#endif
+ }
+
+ if (d->frame)
+ av_free(d->frame);
+
+ memset(d, 0, sizeof(*d));
+}
+
+void mp_decode_push_packet(struct mp_decode *decode, AVPacket *packet)
+{
+ circlebuf_push_back(&decode->packets, packet, sizeof(*packet));
+}
+
+static inline int64_t get_estimated_duration(struct mp_decode *d,
+ int64_t last_pts)
+{
+ if (last_pts)
+ return d->frame_pts - last_pts;
+
+ if (d->audio) {
+ return av_rescale_q(d->frame->nb_samples,
+ (AVRational){1, d->frame->sample_rate},
+ (AVRational){1, 1000000000});
+ } else {
+ if (d->last_duration)
+ return d->last_duration;
+
+ return av_rescale_q(d->decoder->time_base.num,
+ d->decoder->time_base,
+ (AVRational){1, 1000000000});
+ }
+}
+
+bool mp_decode_next(struct mp_decode *d)
+{
+ bool eof = d->m->eof;
+ int got_frame;
+ int ret;
+
+ d->frame_ready = false;
+
+ if (!eof && !d->packets.size)
+ return true;
+
+ while (!d->frame_ready) {
+ if (!d->packet_pending) {
+ if (!d->packets.size) {
+ if (eof) {
+ d->pkt.data = NULL;
+ d->pkt.size = 0;
+ } else {
+ return true;
+ }
+ } else {
+ circlebuf_pop_front(&d->packets, &d->orig_pkt,
+ sizeof(d->orig_pkt));
+ d->pkt = d->orig_pkt;
+ d->packet_pending = true;
+ }
+ }
+
+ if (d->audio) {
+ ret = avcodec_decode_audio4(d->decoder,
+ d->frame, &got_frame, &d->pkt);
+ } else {
+ if (d->m->is_network && !d->got_first_keyframe) {
+ if (d->pkt.flags & AV_PKT_FLAG_KEY) {
+ d->got_first_keyframe = true;
+ } else {
+ av_packet_unref(&d->orig_pkt);
+ av_init_packet(&d->orig_pkt);
+ av_init_packet(&d->pkt);
+ d->packet_pending = false;
+ return true;
+ }
+ }
+
+ ret = avcodec_decode_video2(d->decoder,
+ d->frame, &got_frame, &d->pkt);
+ }
+ if (!got_frame && ret == 0) {
+ d->eof = true;
+ return true;
+ }
+ if (ret < 0) {
+ blog(LOG_DEBUG, "MP: decode failed: %s",
+ av_err2str(ret));
+
+ if (d->packet_pending) {
+ av_packet_unref(&d->orig_pkt);
+ av_init_packet(&d->orig_pkt);
+ av_init_packet(&d->pkt);
+ d->packet_pending = false;
+ }
+ return true;
+ }
+
+ d->frame_ready = !!got_frame;
+
+ if (d->packet_pending) {
+ if (d->pkt.size) {
+ d->pkt.data += ret;
+ d->pkt.size -= ret;
+ }
+
+ if (d->pkt.size == 0) {
+ av_packet_unref(&d->orig_pkt);
+ av_init_packet(&d->orig_pkt);
+ av_init_packet(&d->pkt);
+ d->packet_pending = false;
+ }
+ }
+ }
+
+ if (d->frame_ready) {
+ int64_t last_pts = d->frame_pts;
+
+ d->frame_pts = av_rescale_q(d->frame->best_effort_timestamp,
+ d->stream->time_base,
+ (AVRational){1, 1000000000});
+
+ int64_t duration = d->frame->pkt_duration;
+ if (!duration)
+ duration = get_estimated_duration(d, last_pts);
+ else
+ duration = av_rescale_q(duration,
+ d->stream->time_base,
+ (AVRational){1, 1000000000});
+ d->last_duration = duration;
+ d->next_pts = d->frame_pts + duration;
+ }
+
+ return true;
+}
+
+void mp_decode_flush(struct mp_decode *d)
+{
+ avcodec_flush_buffers(d->decoder);
+ mp_decode_clear_packets(d);
+ d->eof = false;
+ d->frame_pts = 0;
+ d->frame_ready = false;
+}
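`get_estimated_duration()` above leans on `av_rescale_q` to convert sample counts and time-base ticks into nanoseconds. The core arithmetic is `a * bq.num * cq.den / (bq.den * cq.num)`; a simplified reimplementation (without FFmpeg's rounding modes and 128-bit overflow handling) is enough to show the unit conversions used here:

```cpp
#include <cassert>
#include <cstdint>

struct rational { int64_t num, den; };

// Simplified av_rescale_q: rescale a from time base bq to time base
// cq. FFmpeg's real version adds rounding control and overflow-safe
// 128-bit math; this truncating form is for illustration only.
static int64_t rescale_q(int64_t a, rational bq, rational cq)
{
    return a * bq.num * cq.den / (bq.den * cq.num);
}
```

For example, an audio frame of `nb_samples` at 48 kHz converts to nanoseconds with `bq = {1, 48000}` and `cq = {1, 1000000000}`, which is exactly the audio branch of `get_estimated_duration()`.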
obs-studio-18.0.2.tar.xz/deps/media-playback/media-playback/decode.h
Added
+/*
+ * Copyright (c) 2017 Hugh Bailey <obs.jim@gmail.com>
+ *
+ * Permission to use, copy, modify, and distribute this software for any
+ * purpose with or without fee is hereby granted, provided that the above
+ * copyright notice and this permission notice appear in all copies.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ * WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ * MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ * ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ * WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ * ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ * OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+ */
+
+#pragma once
+
+#ifdef __cplusplus
+extern "C" {
+#endif
+
+#include <util/circlebuf.h>
+
+#ifdef _MSC_VER
+#pragma warning(push)
+#pragma warning(disable : 4244)
+#pragma warning(disable : 4204)
+#endif
+
+#include <libavformat/avformat.h>
+#include <libavcodec/avcodec.h>
+#include <util/threading.h>
+
+#ifdef _MSC_VER
+#pragma warning(pop)
+#endif
+
+struct mp_media;
+
+struct mp_decode {
+ struct mp_media *m;
+ AVStream *stream;
+ bool audio;
+
+ AVCodecContext *decoder;
+ AVCodec *codec;
+
+ int64_t last_duration;
+ int64_t frame_pts;
+ int64_t next_pts;
+ AVFrame *frame;
+ bool got_first_keyframe;
+ bool frame_ready;
+ bool eof;
+
+ AVPacket orig_pkt;
+ AVPacket pkt;
+ bool packet_pending;
+ struct circlebuf packets;
+};
+
+extern bool mp_decode_init(struct mp_media *media, enum AVMediaType type,
+ bool hw);
+extern void mp_decode_free(struct mp_decode *decode);
+
+extern void mp_decode_clear_packets(struct mp_decode *decode);
+
+extern void mp_decode_push_packet(struct mp_decode *decode, AVPacket *pkt);
+extern bool mp_decode_next(struct mp_decode *decode);
+extern void mp_decode_flush(struct mp_decode *decode);
+
+#ifdef __cplusplus
+}
+#endif
obs-studio-18.0.2.tar.xz/deps/media-playback/media-playback/media.c
Added
+/*
+ * Copyright (c) 2017 Hugh Bailey <obs.jim@gmail.com>
+ *
+ * Permission to use, copy, modify, and distribute this software for any
+ * purpose with or without fee is hereby granted, provided that the above
+ * copyright notice and this permission notice appear in all copies.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ * WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ * MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ * ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ * WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ * ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ * OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+ */
+
+#include <obs.h>
+#include <util/platform.h>
+
+#include <assert.h>
+
+#include "media.h"
+#include "closest-format.h"
+
+#include <libavdevice/avdevice.h>
+#include <libavutil/imgutils.h>
+
+static int64_t base_sys_ts = 0;
+
+static inline enum video_format convert_pixel_format(int f)
+{
+ switch (f) {
+ case AV_PIX_FMT_NONE: return VIDEO_FORMAT_NONE;
+ case AV_PIX_FMT_YUV420P: return VIDEO_FORMAT_I420;
+ case AV_PIX_FMT_NV12: return VIDEO_FORMAT_NV12;
+ case AV_PIX_FMT_YUYV422: return VIDEO_FORMAT_YUY2;
+ case AV_PIX_FMT_UYVY422: return VIDEO_FORMAT_UYVY;
+ case AV_PIX_FMT_RGBA: return VIDEO_FORMAT_RGBA;
+ case AV_PIX_FMT_BGRA: return VIDEO_FORMAT_BGRA;
+ case AV_PIX_FMT_BGR0: return VIDEO_FORMAT_BGRX;
+ default:;
+ }
+
+ return VIDEO_FORMAT_NONE;
+}
+
+static inline enum audio_format convert_sample_format(int f)
+{
+ switch (f) {
+ case AV_SAMPLE_FMT_U8: return AUDIO_FORMAT_U8BIT;
+ case AV_SAMPLE_FMT_S16: return AUDIO_FORMAT_16BIT;
+ case AV_SAMPLE_FMT_S32: return AUDIO_FORMAT_32BIT;
+ case AV_SAMPLE_FMT_FLT: return AUDIO_FORMAT_FLOAT;
+ case AV_SAMPLE_FMT_U8P: return AUDIO_FORMAT_U8BIT_PLANAR;
+ case AV_SAMPLE_FMT_S16P: return AUDIO_FORMAT_16BIT_PLANAR;
+ case AV_SAMPLE_FMT_S32P: return AUDIO_FORMAT_32BIT_PLANAR;
+ case AV_SAMPLE_FMT_FLTP: return AUDIO_FORMAT_FLOAT_PLANAR;
+ default:;
+ }
+
+ return AUDIO_FORMAT_UNKNOWN;
+}
+
+static inline enum video_colorspace convert_color_space(enum AVColorSpace s)
+{
+ return s == AVCOL_SPC_BT709 ? VIDEO_CS_709 : VIDEO_CS_DEFAULT;
+}
+
+static inline enum video_range_type convert_color_range(enum AVColorRange r)
+{
+ return r == AVCOL_RANGE_JPEG ? VIDEO_RANGE_FULL : VIDEO_RANGE_DEFAULT;
+}
+
+static inline struct mp_decode *get_packet_decoder(mp_media_t *media,
+ AVPacket *pkt)
+{
+ if (media->has_audio && pkt->stream_index == media->a.stream->index)
+ return &media->a;
+ if (media->has_video && pkt->stream_index == media->v.stream->index)
+ return &media->v;
+
+ return NULL;
+}
+
+static int mp_media_next_packet(mp_media_t *media)
+{
+ AVPacket new_pkt;
+ AVPacket pkt;
+ av_init_packet(&pkt);
+ new_pkt = pkt;
+
+ int ret = av_read_frame(media->fmt, &pkt);
+ if (ret < 0) {
+ if (ret != AVERROR_EOF)
+ blog(LOG_WARNING, "MP: av_read_frame failed: %d", ret);
+ return ret;
+ }
+
+ struct mp_decode *d = get_packet_decoder(media, &pkt);
+ if (d && pkt.size) {
+ av_packet_ref(&new_pkt, &pkt);
+ mp_decode_push_packet(d, &new_pkt);
+ }
+
+ av_packet_unref(&pkt);
+ return ret;
+}
+
+static inline bool mp_media_ready_to_start(mp_media_t *m)
+{
+ if (m->has_audio && !m->a.eof && !m->a.frame_ready)
+ return false;
+ if (m->has_video && !m->v.eof && !m->v.frame_ready)
+ return false;
+ return true;
+}
+
+static inline bool mp_decode_frame(struct mp_decode *d)
+{
+ return d->frame_ready || mp_decode_next(d);
+}
+
+static inline int get_sws_colorspace(enum AVColorSpace cs)
+{
+ switch (cs) {
+ case AVCOL_SPC_BT709:
+ return SWS_CS_ITU709;
+ case AVCOL_SPC_FCC:
+ return SWS_CS_FCC;
+ case AVCOL_SPC_SMPTE170M:
+ return SWS_CS_SMPTE170M;
+ case AVCOL_SPC_SMPTE240M:
+ return SWS_CS_SMPTE240M;
+ default:
+ break;
+ }
+
+ return SWS_CS_ITU601;
+}
+
+static inline int get_sws_range(enum AVColorRange r)
+{
+ return r == AVCOL_RANGE_JPEG ? 1 : 0;
+}
+
+#define FIXED_1_0 (1<<16)
+
+static bool mp_media_init_scaling(mp_media_t *m)
+{
+ int space = get_sws_colorspace(m->v.decoder->colorspace);
+ int range = get_sws_range(m->v.decoder->color_range);
+ const int *coeff = sws_getCoefficients(space);
+
+ m->swscale = sws_getCachedContext(NULL,
+ m->v.decoder->width, m->v.decoder->height,
+ m->v.decoder->pix_fmt,
+ m->v.decoder->width, m->v.decoder->height,
+ m->scale_format,
+ SWS_FAST_BILINEAR, NULL, NULL, NULL);
+ if (!m->swscale) {
+ blog(LOG_WARNING, "MP: Failed to initialize scaler");
+ return false;
+ }
+
+ sws_setColorspaceDetails(m->swscale, coeff, range, coeff, range, 0,
+ FIXED_1_0, FIXED_1_0);
+
+ int ret = av_image_alloc(m->scale_pic, m->scale_linesizes,
+ m->v.decoder->width, m->v.decoder->height,
+ m->scale_format, 1);
+ if (ret < 0) {
+ blog(LOG_WARNING, "MP: Failed to create scale pic data");
+ return false;
+ }
+
+ return true;
+}
+
+static bool mp_media_prepare_frames(mp_media_t *m)
+{
+ while (!mp_media_ready_to_start(m)) {
+ if (!m->eof) {
+ int ret = mp_media_next_packet(m);
+ if (ret == AVERROR_EOF)
+ m->eof = true;
+ else if (ret < 0)
+ return false;
+ }
+
+ if (m->has_video && !mp_decode_frame(&m->v))
+ return false;
+ if (m->has_audio && !mp_decode_frame(&m->a))
+ return false;
+ }
+
+ if (m->has_video && m->v.frame_ready && !m->swscale) {
+ m->scale_format = closest_format(m->v.frame->format);
+ if (m->scale_format != m->v.frame->format) {
+ if (!mp_media_init_scaling(m)) {
+ return false;
+ }
+ }
+ }
+
+ return true;
+}
+
+static inline int64_t mp_media_get_next_min_pts(mp_media_t *m)
+{
+ int64_t min_next_ns = 0x7FFFFFFFFFFFFFFFLL;
+
+ if (m->has_video && m->v.frame_ready) {
+ if (m->v.frame_pts < min_next_ns)
+ min_next_ns = m->v.frame_pts;
+ }
+ if (m->has_audio && m->a.frame_ready) {
+ if (m->a.frame_pts < min_next_ns)
+ min_next_ns = m->a.frame_pts;
+ }
+
+ return min_next_ns;
+}
+
+static inline int64_t mp_media_get_base_pts(mp_media_t *m)
+{
+ int64_t base_ts = 0;
+
+ if (m->has_video && m->v.next_pts > base_ts)
+ base_ts = m->v.next_pts;
+ if (m->has_audio && m->a.next_pts > base_ts)
+ base_ts = m->a.next_pts;
+
+ return base_ts;
+}
+
+static inline bool mp_media_can_play_frame(mp_media_t *m,
+ struct mp_decode *d)
+{
+ return d->frame_ready && d->frame_pts <= m->next_pts_ns;
+}
+
+static void mp_media_next_audio(mp_media_t *m)
+{
+ struct mp_decode *d = &m->a;
+ struct obs_source_audio audio = {0};
+ AVFrame *f = d->frame;
+
+ if (!mp_media_can_play_frame(m, d))
+ return;
+
+ d->frame_ready = false;
+ if (!m->a_cb)
+ return;
+
+ for (size_t i = 0; i < MAX_AV_PLANES; i++)
+ audio.data[i] = f->data[i];
+
+ audio.samples_per_sec = f->sample_rate;
+ audio.speakers = (enum speaker_layout)f->channels;
+ audio.format = convert_sample_format(f->format);
+ audio.frames = f->nb_samples;
+ audio.timestamp = m->base_ts + d->frame_pts - m->start_ts +
+ m->play_sys_ts - base_sys_ts;
+
+ if (audio.format == AUDIO_FORMAT_UNKNOWN)
+ return;
+
+ m->a_cb(m->opaque, &audio);
+}
+
+static void mp_media_next_video(mp_media_t *m, bool preload)
+{
+ struct mp_decode *d = &m->v;
+ struct obs_source_frame *frame = &m->obsframe;
+ enum video_format new_format;
+ enum video_colorspace new_space;
+ enum video_range_type new_range;
+ AVFrame *f = d->frame;
+
+ if (!preload) {
+ if (!mp_media_can_play_frame(m, d))
+ return;
+
+ d->frame_ready = false;
+
+ if (!m->v_cb)
+ return;
+ } else if (!d->frame_ready) {
+ return;
+ }
+
+ if (m->swscale) {
+ int ret = sws_scale(m->swscale,
+ (const uint8_t *const *)f->data, f->linesize,
+ 0, f->height,
+ m->scale_pic, m->scale_linesizes);
+ if (ret < 0)
+ return;
+
+ for (size_t i = 0; i < 4; i++) {
+ frame->data[i] = m->scale_pic[i];
+ frame->linesize[i] = m->scale_linesizes[i];
+ }
+ } else {
+ for (size_t i = 0; i < MAX_AV_PLANES; i++) {
+ frame->data[i] = f->data[i];
+ frame->linesize[i] = f->linesize[i];
+ }
+ }
+
+ new_format = convert_pixel_format(m->scale_format);
+ new_space = convert_color_space(f->colorspace);
+ new_range = m->force_range == VIDEO_RANGE_DEFAULT
+ ? convert_color_range(f->color_range)
+ : m->force_range;
+
+ if (new_format != frame->format ||
+ new_space != m->cur_space ||
+ new_range != m->cur_range) {
+ bool success;
+
+ frame->format = new_format;
+ frame->full_range = new_range == VIDEO_RANGE_FULL;
+
+ success = video_format_get_parameters(
+ new_space,
+ new_range,
+ frame->color_matrix,
+ frame->color_range_min,
+ frame->color_range_max);
+
+ frame->format = new_format;
+ m->cur_space = new_space;
+ m->cur_range = new_range;
+
+ if (!success) {
+ frame->format = VIDEO_FORMAT_NONE;
+ return;
+ }
+ }
+
+ if (frame->format == VIDEO_FORMAT_NONE)
+ return;
+
+ frame->timestamp = m->base_ts + d->frame_pts - m->start_ts +
+ m->play_sys_ts - base_sys_ts;
+ frame->width = f->width;
+ frame->height = f->height;
+ frame->flip = false;
+
+ if (preload)
+ m->v_preload_cb(m->opaque, frame);
+ else
+ m->v_cb(m->opaque, frame);
+}
+
+static void mp_media_calc_next_ns(mp_media_t *m)
+{
+ int64_t min_next_ns = mp_media_get_next_min_pts(m);
+
+ int64_t delta = min_next_ns - m->next_pts_ns;
+#ifdef _DEBUG
+ assert(delta >= 0);
+#endif
+ if (delta < 0)
+ delta = 0;
+ if (delta > 3000000000)
+ delta = 0;
+
+ m->next_ns += delta;
+ m->next_pts_ns = min_next_ns;
+}
+
+static bool mp_media_reset(mp_media_t *m)
+{
+ AVStream *stream = m->fmt->streams[0];
+ int64_t seek_pos;
+ int seek_flags;
+ bool stopping;
+ bool active;
+
+ if (m->fmt->duration == AV_NOPTS_VALUE) {
+ seek_pos = 0;
+ seek_flags = AVSEEK_FLAG_FRAME;
+ } else {
+ seek_pos = m->fmt->start_time;
+ seek_flags = AVSEEK_FLAG_BACKWARD;
+ }
+
+ int64_t seek_target = seek_flags == AVSEEK_FLAG_BACKWARD
+ ? av_rescale_q(seek_pos, AV_TIME_BASE_Q, stream->time_base)
+ : seek_pos;
+
+ int ret = av_seek_frame(m->fmt, 0, seek_target, seek_flags);
+ if (ret < 0) {
+ blog(LOG_WARNING, "MP: Failed to seek: %s", av_err2str(ret));
+ return false;
+ }
+
+ if (m->has_video && !m->is_network)
+ mp_decode_flush(&m->v);
+ if (m->has_audio && !m->is_network)
+ mp_decode_flush(&m->a);
+
+ int64_t next_ts = mp_media_get_base_pts(m);
+ int64_t offset = next_ts - m->next_pts_ns;
+
+ m->eof = false;
+ m->base_ts += next_ts;
+
+ if (!mp_media_prepare_frames(m))
+ return false;
+
+ pthread_mutex_lock(&m->mutex);
+ stopping = m->stopping;
+ active = m->active;
+ m->stopping = false;
+ pthread_mutex_unlock(&m->mutex);
+
+ if (active) {
+ if (!m->play_sys_ts)
+ m->play_sys_ts = (int64_t)os_gettime_ns();
+ m->start_ts = m->next_pts_ns = mp_media_get_next_min_pts(m);
+ if (m->next_ns)
+ m->next_ns += offset;
+ } else {
+ m->start_ts = m->next_pts_ns = mp_media_get_next_min_pts(m);
+ m->play_sys_ts = (int64_t)os_gettime_ns();
+ m->next_ns = 0;
+ }
+
+ if (!active && !m->is_network && m->v_preload_cb)
+ mp_media_next_video(m, true);
+ if (stopping && m->stop_cb)
+ m->stop_cb(m->opaque);
+ return true;
+}
+
+static inline void mp_media_sleepto(mp_media_t *m)
+{
+ if (!m->next_ns)
+ m->next_ns = os_gettime_ns();
+ else
+ os_sleepto_ns(m->next_ns);
+}
+
+static inline bool mp_media_eof(mp_media_t *m)
+{
+ bool v_ended = !m->has_video || !m->v.frame_ready;
+ bool a_ended = !m->has_audio || !m->a.frame_ready;
+ bool eof = v_ended && a_ended;
+
+ if (eof) {
+ bool looping;
+
+ pthread_mutex_lock(&m->mutex);
+ looping = m->looping;
+ if (!looping) {
+ m->active = false;
+ m->stopping = true;
+ }
+ pthread_mutex_unlock(&m->mutex);
+
+ mp_media_reset(m);
+ }
+
+ return eof;
+}
+
+static void *mp_media_thread(void *opaque)
+{
+ mp_media_t *m = opaque;
+
+ os_set_thread_name("mp_media_thread");
+
+ mp_media_reset(m);
+
+ for (;;) {
+ bool reset, kill, is_active;
+
+ pthread_mutex_lock(&m->mutex);
+ is_active = m->active;
+ pthread_mutex_unlock(&m->mutex);
+
+ if (!is_active) {
+ if (os_sem_wait(m->sem) < 0)
+ return NULL;
+ } else {
+ mp_media_sleepto(m);
+ }
+
+ pthread_mutex_lock(&m->mutex);
+
+ reset = m->reset;
+ kill = m->kill;
+ m->reset = false;
+ m->kill = false;
+
+ pthread_mutex_unlock(&m->mutex);
+
+ if (kill) {
+ break;
+ }
+ if (reset) {
+ mp_media_reset(m);
+ continue;
+ }
+
+ /* frames are ready */
+ if (is_active) {
+ if (m->has_video)
+ mp_media_next_video(m, false);
+ if (m->has_audio)
+ mp_media_next_audio(m);
+
+ if (!mp_media_prepare_frames(m))
+ return NULL;
+ if (mp_media_eof(m))
+ continue;
+
+ mp_media_calc_next_ns(m);
+ }
+ }
+
+ return NULL;
+}
+
+static inline bool mp_media_init_internal(mp_media_t *m,
+ const char *path,
+ const char *format_name,
+ bool hw)
+{
+ AVInputFormat *format = NULL;
+
+ if (pthread_mutex_init(&m->mutex, NULL) != 0) {
+ blog(LOG_WARNING, "MP: Failed to init mutex");
+ return false;
+ }
+ if (os_sem_init(&m->sem, 0) != 0) {
+ blog(LOG_WARNING, "MP: Failed to init semaphore");
+ return false;
+ }
+
+ if (format_name && *format_name) {
+ format = av_find_input_format(format_name);
+ if (!format)
+ blog(LOG_INFO, "MP: Unable to find input format for "
+ "'%s'", path);
+ }
+
+ int ret = avformat_open_input(&m->fmt, path, format, NULL);
+ if (ret < 0) {
+ blog(LOG_WARNING, "MP: Failed to open media: '%s'", path);
+ return false;
+ }
+
+ if (avformat_find_stream_info(m->fmt, NULL) < 0) {
+ blog(LOG_WARNING, "MP: Failed to find stream info for '%s'",
+ path);
+ return false;
+ }
+
+ m->has_video = mp_decode_init(m, AVMEDIA_TYPE_VIDEO, hw);
+ m->has_audio = mp_decode_init(m, AVMEDIA_TYPE_AUDIO, hw);
+
+ if (!m->has_video && !m->has_audio) {
+ blog(LOG_WARNING, "MP: Could not initialize audio or video: "
+ "'%s'", path);
+ return false;
+ }
+
+ if (pthread_create(&m->thread, NULL, mp_media_thread, m) != 0) {
+ blog(LOG_WARNING, "MP: Could not create media thread");
+ return false;
+ }
+
+ m->thread_valid = true;
+ return true;
+}
+
+bool mp_media_init(mp_media_t *media,
+ const char *path,
+ const char *format,
+ void *opaque,
+ mp_video_cb v_cb,
+ mp_audio_cb a_cb,
+ mp_stop_cb stop_cb,
+ mp_video_cb v_preload_cb,
+ bool hw_decoding,
+ enum video_range_type force_range)
+{
+ memset(media, 0, sizeof(*media));
+ pthread_mutex_init_value(&media->mutex);
+ media->opaque = opaque;
+ media->v_cb = v_cb;
+ media->a_cb = a_cb;
+ media->stop_cb = stop_cb;
+ media->v_preload_cb = v_preload_cb;
+ media->force_range = force_range;
+
+ if (path && *path)
+ media->is_network = !!strstr(path, "://");
+
+ static bool initialized = false;
+ if (!initialized) {
+ av_register_all();
+ avdevice_register_all();
+ avcodec_register_all();
+ avformat_network_init();
+ initialized = true;
+ }
+
+ if (!base_sys_ts)
+ base_sys_ts = (int64_t)os_gettime_ns();
+
+ if (!mp_media_init_internal(media, path, format, hw_decoding)) {
+ mp_media_free(media);
+ return false;
+ }
+
+ return true;
+}
+
+static void mp_kill_thread(mp_media_t *m)
+{
+ if (m->thread_valid) {
+ pthread_mutex_lock(&m->mutex);
+ m->kill = true;
+ pthread_mutex_unlock(&m->mutex);
+ os_sem_post(m->sem);
+
+ pthread_join(m->thread, NULL);
+ }
+}
+
+void mp_media_free(mp_media_t *media)
+{
+ if (!media)
+ return;
+
+ mp_media_stop(media);
+ mp_kill_thread(media);
+ mp_decode_free(&media->v);
+ mp_decode_free(&media->a);
+ pthread_mutex_destroy(&media->mutex);
+ os_sem_destroy(media->sem);
+ avformat_close_input(&media->fmt);
+ sws_freeContext(media->swscale);
+ av_freep(&media->scale_pic[0]);
+ memset(media, 0, sizeof(*media));
+ pthread_mutex_init_value(&media->mutex);
+}
+
+void mp_media_play(mp_media_t *m, bool loop)
+{
+ pthread_mutex_lock(&m->mutex);
+
+ if (m->active)
+ m->reset = true;
+
+ m->looping = loop;
+ m->active = true;
+
+ pthread_mutex_unlock(&m->mutex);
+
+ os_sem_post(m->sem);
+}
+
+void mp_media_stop(mp_media_t *m)
+{
+ pthread_mutex_lock(&m->mutex);
+ if (m->active) {
+ m->reset = true;
+ m->active = false;
+ m->stopping = true;
+ os_sem_post(m->sem);
+ }
+ pthread_mutex_unlock(&m->mutex);
+}
obs-studio-18.0.2.tar.xz/deps/media-playback/media-playback/media.h
Added
+/*
+ * Copyright (c) 2017 Hugh Bailey <obs.jim@gmail.com>
+ *
+ * Permission to use, copy, modify, and distribute this software for any
+ * purpose with or without fee is hereby granted, provided that the above
+ * copyright notice and this permission notice appear in all copies.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ * WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ * MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ * ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ * WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ * ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ * OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+ */
+
+#pragma once
+
+#include <obs.h>
+#include "decode.h"
+
+#ifdef __cplusplus
+extern "C" {
+#endif
+
+#ifdef _MSC_VER
+#pragma warning(push)
+#pragma warning(disable : 4244)
+#pragma warning(disable : 4204)
+#endif
+
+#include <libavformat/avformat.h>
+#include <libavcodec/avcodec.h>
+#include <libswscale/swscale.h>
+#include <util/threading.h>
+
+#ifdef _MSC_VER
+#pragma warning(pop)
+#endif
+
+typedef void (*mp_video_cb)(void *opaque, struct obs_source_frame *frame);
+typedef void (*mp_audio_cb)(void *opaque, struct obs_source_audio *audio);
+typedef void (*mp_stop_cb)(void *opaque);
+
+struct mp_media {
+ AVFormatContext *fmt;
+
+ mp_video_cb v_preload_cb;
+ mp_stop_cb stop_cb;
+ mp_video_cb v_cb;
+ mp_audio_cb a_cb;
+ void *opaque;
+
+ enum AVPixelFormat scale_format;
+ struct SwsContext *swscale;
+ int scale_linesizes[4];
+ uint8_t *scale_pic[4];
+
+ struct mp_decode v;
+ struct mp_decode a;
+ bool is_network;
+ bool has_video;
+ bool has_audio;
+ bool is_file;
+ bool eof;
+
+ struct obs_source_frame obsframe;
+ enum video_colorspace cur_space;
+ enum video_range_type cur_range;
+ enum video_range_type force_range;
+
+ int64_t play_sys_ts;
+ int64_t next_pts_ns;
+ uint64_t next_ns;
+ int64_t start_ts;
+ int64_t base_ts;
+
+ pthread_mutex_t mutex;
+ os_sem_t *sem;
+ bool stopping;
+ bool looping;
+ bool active;
+ bool reset;
+ bool kill;
+
+ bool thread_valid;
+ pthread_t thread;
+};
+
+typedef struct mp_media mp_media_t;
+
+extern bool mp_media_init(mp_media_t *media,
+ const char *path,
+ const char *format,
+ void *opaque,
+ mp_video_cb v_cb,
+ mp_audio_cb a_cb,
+ mp_stop_cb stop_cb,
+ mp_video_cb v_preload_cb,
+ bool hardware_decoding,
+ enum video_range_type force_range);
+extern void mp_media_free(mp_media_t *media);
+
+extern void mp_media_play(mp_media_t *media, bool loop);
+extern void mp_media_stop(mp_media_t *media);
+
+#ifdef __cplusplus
+}
+#endif
obs-studio-18.0.1.tar.xz/libobs-d3d11/d3d11-subsystem.cpp -> obs-studio-18.0.2.tar.xz/libobs-d3d11/d3d11-subsystem.cpp
Changed
if (FAILED(hr))
throw UnsupportedHWError("Failed to create device", hr);
- blog(LOG_INFO, "D3D11 loaded sucessfully, feature level used: %u",
+ blog(LOG_INFO, "D3D11 loaded successfully, feature level used: %u",
(unsigned int)levelUsed);
}
if (FAILED(hr))
continue;
- /* ignore microsoft's 'basic' renderer' */
+ /* ignore Microsoft's 'basic' renderer' */
if (desc.VendorId == 0x1414 && desc.DeviceId == 0x8c)
continue;
if (FAILED(hr))
continue;
- /* ignore microsoft's 'basic' renderer' */
+ /* ignore Microsoft's 'basic' renderer' */
if (desc.VendorId == 0x1414 && desc.DeviceId == 0x8c)
continue;
try {
blog(LOG_INFO, "---------------------------------");
- blog(LOG_INFO, "Initializing D3D11..");
+ blog(LOG_INFO, "Initializing D3D11...");
LogD3DAdapters();
device = new gs_device(adapter);
obs-studio-18.0.1.tar.xz/libobs-opengl/gl-stagesurf.c -> obs-studio-18.0.2.tar.xz/libobs-opengl/gl-stagesurf.c
Changed
return false;
size = surf->width * surf->bytes_per_pixel;
- size = (size+3) & 0xFFFFFFFC; /* align width to 4-byte boundry */
+ size = (size+3) & 0xFFFFFFFC; /* align width to 4-byte boundary */
size *= surf->height;
glBufferData(GL_PIXEL_PACK_BUFFER, size, 0, GL_DYNAMIC_READ);
#ifdef __APPLE__
/* Apparently for mac, PBOs won't do an asynchronous transfer unless you use
- * FBOs aong with glReadPixels, which is really dumb. */
+ * FBOs along with glReadPixels, which is really dumb. */
void device_stage_texture(gs_device_t *device, gs_stagesurf_t *dst,
gs_texture_t *src)
{
obs-studio-18.0.1.tar.xz/libobs-opengl/gl-subsystem.c -> obs-studio-18.0.2.tar.xz/libobs-opengl/gl-subsystem.c
Changed
struct gs_device *device = bzalloc(sizeof(struct gs_device));
int errorcode = GS_ERROR_FAIL;
+ blog(LOG_INFO, "---------------------------------");
+ blog(LOG_INFO, "Initializing OpenGL...");
+
device->plat = gl_platform_create(device, adapter);
if (!device->plat)
goto fail;
errorcode = GS_ERROR_NOT_SUPPORTED;
goto fail;
}
+
+ blog(LOG_INFO, "OpenGL version: %s", glGetString(GL_VERSION));
gl_enable(GL_CULL_FACE);
obs-studio-18.0.1.tar.xz/libobs-opengl/gl-windows.c -> obs-studio-18.0.2.tar.xz/libobs-opengl/gl-windows.c
Changed
}
}
-/* would use designated initializers but microsoft sort of sucks */
+/* would use designated initializers but Microsoft sort of sucks */
static inline void init_dummy_pixel_format(PIXELFORMATDESCRIPTOR *pfd)
{
memset(pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
obs-studio-18.0.1.tar.xz/libobs-opengl/gl-x11.c -> obs-studio-18.0.2.tar.xz/libobs-opengl/gl-x11.c
Changed
goto fail_load_gl;
}
- blog(LOG_INFO, "OpenGL version: %s\n", glGetString(GL_VERSION));
-
goto success;
fail_make_current:
obs-studio-18.0.1.tar.xz/libobs/audio-monitoring/osx/coreaudio-enum-devices.c -> obs-studio-18.0.2.tar.xz/libobs/audio-monitoring/osx/coreaudio-enum-devices.c
Changed
return (bool)CFStringGetCString(ref, buf, size, kCFStringEncodingUTF8);
}
-static void obs_enum_audio_monitoring_device(obs_enum_audio_device_cb cb,
- void *data, AudioDeviceID id)
+static bool obs_enum_audio_monitoring_device(obs_enum_audio_device_cb cb,
+ void *data, AudioDeviceID id, bool allow_inputs)
{
UInt32 size = 0;
CFStringRef cf_name = NULL;
char name[1024];
char uid[1024];
OSStatus stat;
+ bool cont = true;
AudioObjectPropertyAddress addr = {
kAudioDevicePropertyStreams,
};
/* check to see if it's a mac input device */
- AudioObjectGetPropertyDataSize(id, &addr, 0, NULL, &size);
- if (!size)
- return;
+ if (!allow_inputs) {
+ AudioObjectGetPropertyDataSize(id, &addr, 0, NULL, &size);
+ if (!size)
+ return true;
+ }
size = sizeof(CFStringRef);
addr.mSelector = kAudioDevicePropertyDeviceUID;
stat = AudioObjectGetPropertyData(id, &addr, 0, NULL, &size, &cf_uid);
if (!success(stat, "get audio device UID"))
- return;
+ return true;
addr.mSelector = kAudioDevicePropertyDeviceNameCFString;
stat = AudioObjectGetPropertyData(id, &addr, 0, NULL, &size, &cf_name);
goto fail;
}
- cb(data, name, uid);
+ cont = cb(data, name, uid);
fail:
if (cf_name)
CFRelease(cf_name);
if (cf_uid)
CFRelease(cf_uid);
+ return cont;
}
-void obs_enum_audio_monitoring_devices(obs_enum_audio_device_cb cb, void *data)
+static void enum_audio_devices(obs_enum_audio_device_cb cb, void *data,
+ bool allow_inputs)
{
AudioObjectPropertyAddress addr = {
kAudioHardwarePropertyDevices,
stat = AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr,
0, NULL, &size, ids);
if (success(stat, "get data")) {
- for (UInt32 i = 0; i < count; i++)
- obs_enum_audio_monitoring_device(cb, data, ids[i]);
+ for (UInt32 i = 0; i < count; i++) {
+ if (!obs_enum_audio_monitoring_device(cb, data, ids[i],
+ allow_inputs))
+ break;
+ }
}
free(ids);
}
+
+void obs_enum_audio_monitoring_devices(obs_enum_audio_device_cb cb, void *data)
+{
+ enum_audio_devices(cb, data, false);
+}
+
+static bool alloc_default_id(void *data, const char *name, const char *id)
+{
+ char **p_id = data;
+ UNUSED_PARAMETER(name);
+
+ *p_id = bstrdup(id);
+ return false;
+}
+
+static void get_default_id(char **p_id)
+{
+ AudioObjectPropertyAddress addr = {
+ kAudioHardwarePropertyDefaultSystemOutputDevice,
+ kAudioObjectPropertyScopeGlobal,
+ kAudioObjectPropertyElementMaster
+ };
+
+ if (*p_id)
+ return;
+
+ OSStatus stat;
+ AudioDeviceID id = 0;
+ UInt32 size = sizeof(id);
+
+ stat = AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0,
+ NULL, &size, &id);
+ if (success(stat, "AudioObjectGetPropertyData"))
+ obs_enum_audio_monitoring_device(alloc_default_id, p_id, id,
+ true);
+ if (!*p_id)
+ *p_id = bzalloc(1);
+}
+
+struct device_name_info {
+ const char *id;
+ char *name;
+};
+
+static bool enum_device_name(void *data, const char *name, const char *id)
+{
+ struct device_name_info *info = data;
+
+ if (strcmp(info->id, id) == 0) {
+ info->name = bstrdup(name);
+ return false;
+ }
+
+ return true;
+}
+
+bool devices_match(const char *id1, const char *id2)
+{
+ struct device_name_info info = {0};
+ char *default_id = NULL;
+ char *name1 = NULL;
+ char *name2 = NULL;
+ bool match;
+
+ if (!id1 || !id2)
+ return false;
+
+ if (strcmp(id1, "default") == 0) {
+ get_default_id(&default_id);
+ id1 = default_id;
+ }
+ if (strcmp(id2, "default") == 0) {
+ get_default_id(&default_id);
+ id2 = default_id;
+ }
+
+ info.id = id1;
+ enum_audio_devices(enum_device_name, &info, true);
+ name1 = info.name;
+
+ info.name = NULL;
+ info.id = id2;
+ enum_audio_devices(enum_device_name, &info, true);
+ name2 = info.name;
+
+ match = name1 && name2 && strcmp(name1, name2) == 0;
+ bfree(default_id);
+ bfree(name1);
+ bfree(name2);
+
+ return match;
+}
obs-studio-18.0.1.tar.xz/libobs/audio-monitoring/osx/coreaudio-output.c -> obs-studio-18.0.2.tar.xz/libobs/audio-monitoring/osx/coreaudio-output.c
Changed
volatile bool active;
bool paused;
+ bool ignore;
};
static inline bool fill_buffer(struct audio_monitor *monitor)
UNUSED_PARAMETER(aq);
}
-static bool audio_monitor_init(struct audio_monitor *monitor)
+extern bool devices_match(const char *id1, const char *id2);
+
+static bool audio_monitor_init(struct audio_monitor *monitor,
+ obs_source_t *source)
{
const struct audio_output_info *info = audio_output_get_info(
obs->audio.audio);
.mBitsPerChannel = sizeof(float) * 8
};
+ monitor->source = source;
+
monitor->channels = channels;
monitor->buffer_size =
channels * sizeof(float) * info->samples_per_sec / 100 * 3;
pthread_mutex_init_value(&monitor->mutex);
- stat = AudioQueueNewOutput(&desc, buffer_audio, monitor, NULL, NULL, 0,
- &monitor->queue);
- if (!success(stat, "AudioStreamBasicDescription")) {
+ const char *uid = obs->audio.monitoring_device_id;
+ if (!uid || !*uid) {
return false;
}
- const char *uid = obs->audio.monitoring_device_id;
- if (!uid || !*uid) {
+ if (source->info.output_flags & OBS_SOURCE_DO_NOT_SELF_MONITOR) {
+ obs_data_t *s = obs_source_get_settings(source);
+ const char *s_dev_id = obs_data_get_string(s, "device_id");
+ bool match = devices_match(s_dev_id, uid);
+ obs_data_release(s);
+
+ if (match) {
+ monitor->ignore = true;
+ return true;
+ }
+ }
+
+ stat = AudioQueueNewOutput(&desc, buffer_audio, monitor, NULL, NULL, 0,
+ &monitor->queue);
+ if (!success(stat, "AudioStreamBasicDescription")) {
return false;
}
pthread_mutex_destroy(&monitor->mutex);
}
-static void audio_monitor_init_final(struct audio_monitor *monitor,
- obs_source_t *source)
+static void audio_monitor_init_final(struct audio_monitor *monitor)
{
- monitor->source = source;
- obs_source_add_audio_capture_callback(source, on_audio_playback,
- monitor);
+ if (monitor->ignore)
+ return;
+
+ obs_source_add_audio_capture_callback(monitor->source,
+ on_audio_playback, monitor);
}
struct audio_monitor *audio_monitor_create(obs_source_t *source)
{
struct audio_monitor *monitor = bzalloc(sizeof(*monitor));
- if (!audio_monitor_init(monitor)) {
+ if (!audio_monitor_init(monitor, source)) {
goto fail;
}
da_push_back(obs->audio.monitors, &monitor);
pthread_mutex_unlock(&obs->audio.monitoring_mutex);
- audio_monitor_init_final(monitor, source);
+ audio_monitor_init_final(monitor);
return monitor;
fail:
audio_monitor_free(monitor);
memset(monitor, 0, sizeof(*monitor));
- success = audio_monitor_init(monitor);
+ success = audio_monitor_init(monitor, source);
if (success)
- audio_monitor_init_final(monitor, source);
+ audio_monitor_init_final(monitor);
}
void audio_monitor_destroy(struct audio_monitor *monitor)
obs-studio-18.0.1.tar.xz/libobs/audio-monitoring/win32/wasapi-enum-devices.c -> obs-studio-18.0.2.tar.xz/libobs/audio-monitoring/win32/wasapi-enum-devices.c
Changed
safe_release(enumerator);
safe_release(collection);
}
+
+static void get_default_id(char **p_id)
+{
+ IMMDeviceEnumerator *immde = NULL;
+ IMMDevice *device = NULL;
+ WCHAR *w_id = NULL;
+ HRESULT hr;
+
+ if (*p_id)
+ return;
+
+ hr = CoCreateInstance(&CLSID_MMDeviceEnumerator, NULL, CLSCTX_ALL,
+ &IID_IMMDeviceEnumerator, &immde);
+ if (FAILED(hr)) {
+ goto fail;
+ }
+
+ hr = immde->lpVtbl->GetDefaultAudioEndpoint(immde,
+ eRender, eConsole, &device);
+ if (FAILED(hr)) {
+ goto fail;
+ }
+
+ hr = device->lpVtbl->GetId(device, &w_id);
+ if (FAILED(hr)) {
+ goto fail;
+ }
+
+ os_wcs_to_utf8_ptr(w_id, 0, p_id);
+
+fail:
+ if (!*p_id)
+ *p_id = bzalloc(1);
+ if (immde)
+ immde->lpVtbl->Release(immde);
+ if (device)
+ device->lpVtbl->Release(device);
+ if (w_id)
+ CoTaskMemFree(w_id);
+}
+
+bool devices_match(const char *id1, const char *id2)
+{
+ char *default_id = NULL;
+ bool match;
+
+ if (!id1 || !id2)
+ return false;
+
+ if (strcmp(id1, "default") == 0) {
+ get_default_id(&default_id);
+ id1 = default_id;
+ }
+ if (strcmp(id2, "default") == 0) {
+ get_default_id(&default_id);
+ id2 = default_id;
+ }
+
+ match = strcmp(id1, id2) == 0;
+ bfree(default_id);
+
+ return match;
+}
obs-studio-18.0.1.tar.xz/libobs/audio-monitoring/win32/wasapi-output.c -> obs-studio-18.0.2.tar.xz/libobs/audio-monitoring/win32/wasapi-output.c
Changed
uint32_t sample_rate;
uint32_t channels;
bool source_has_video : 1;
+ bool ignore : 1;
int64_t lowest_audio_offset;
struct circlebuf delay_buffer;
static inline void audio_monitor_free(struct audio_monitor *monitor)
{
+ if (monitor->ignore)
+ return;
+
if (monitor->source) {
obs_source_remove_audio_capture_callback(
monitor->source, on_audio_playback, monitor);
return (enum speaker_layout)channels;
}
-static bool audio_monitor_init(struct audio_monitor *monitor)
+extern bool devices_match(const char *id1, const char *id2);
+
+static bool audio_monitor_init(struct audio_monitor *monitor,
+ obs_source_t *source)
{
IMMDeviceEnumerator *immde = NULL;
WAVEFORMATEX *wfex = NULL;
UINT32 frames;
HRESULT hr;
+ pthread_mutex_init_value(&monitor->playback_mutex);
+
+ monitor->source = source;
+
const char *id = obs->audio.monitoring_device_id;
if (!id) {
return false;
}
- pthread_mutex_init_value(&monitor->playback_mutex);
+ if (source->info.output_flags & OBS_SOURCE_DO_NOT_SELF_MONITOR) {
+ obs_data_t *s = obs_source_get_settings(source);
+ const char *s_dev_id = obs_data_get_string(s, "device_id");
+ bool match = devices_match(s_dev_id, id);
+ obs_data_release(s);
+
+ if (match) {
+ monitor->ignore = true;
+ return true;
+ }
+ }
/* ------------------------------------------ *
* Init device */
return success;
}
-static void audio_monitor_init_final(struct audio_monitor *monitor,
- obs_source_t *source)
+static void audio_monitor_init_final(struct audio_monitor *monitor)
{
- monitor->source = source;
+ if (monitor->ignore)
+ return;
+
monitor->source_has_video =
- (source->info.output_flags & OBS_SOURCE_VIDEO) != 0;
- obs_source_add_audio_capture_callback(source, on_audio_playback,
- monitor);
+ (monitor->source->info.output_flags & OBS_SOURCE_VIDEO) != 0;
+ obs_source_add_audio_capture_callback(monitor->source,
+ on_audio_playback, monitor);
}
struct audio_monitor *audio_monitor_create(obs_source_t *source)
struct audio_monitor monitor = {0};
struct audio_monitor *out;
- if (!audio_monitor_init(&monitor)) {
+ if (!audio_monitor_init(&monitor, source)) {
goto fail;
}
da_push_back(obs->audio.monitors, &out);
pthread_mutex_unlock(&obs->audio.monitoring_mutex);
- audio_monitor_init_final(out, source);
+ audio_monitor_init_final(out);
return out;
fail:
bool success;
pthread_mutex_lock(&monitor->playback_mutex);
- success = audio_monitor_init(&new_monitor);
+ success = audio_monitor_init(&new_monitor, monitor->source);
pthread_mutex_unlock(&monitor->playback_mutex);
if (success) {
obs_source_t *source = monitor->source;
audio_monitor_free(monitor);
*monitor = new_monitor;
- audio_monitor_init_final(monitor, source);
+ audio_monitor_init_final(monitor);
} else {
audio_monitor_free(&new_monitor);
}
obs-studio-18.0.1.tar.xz/libobs/callback/calldata.h -> obs-studio-18.0.2.tar.xz/libobs/callback/calldata.h
Changed
}
/* ------------------------------------------------------------------------- */
-/* NOTE: 'get' functions return true only if paramter exists, and is the
+/* NOTE: 'get' functions return true only if parameter exists, and is the
* same type. They return false otherwise. */
static inline bool calldata_get_int(const calldata_t *data, const char *name,
obs-studio-18.0.1.tar.xz/libobs/callback/decl.c -> obs-studio-18.0.2.tar.xz/libobs/callback/decl.c
Changed
int code;
struct decl_param param = {0};
- /* get stprage specifiers */
+ /* get storage specifiers */
code = cf_next_name_ref(cfp, &ref, TYPE_OR_STORAGE, ",");
if (code != PARSE_SUCCESS)
return code;
return code;
}
- /* parameters not marked with specifers are input parameters */
+ /* parameters not marked with specifiers are input parameters */
if (param.flags == 0)
param.flags = CALL_PARAM_IN;
ret_param.flags = CALL_PARAM_OUT;
cf_parser_init(&cfp);
- if (!cf_parser_parse(&cfp, decl_string, "declaraion"))
+ if (!cf_parser_parse(&cfp, decl_string, "declaration"))
goto fail;
code = cf_get_name_ref(&cfp, &ret_type, "return type", NULL);
obs-studio-18.0.1.tar.xz/libobs/graphics/graphics.h -> obs-studio-18.0.2.tar.xz/libobs/graphics/graphics.h
Changed
* Draws a 2D sprite
*
* If width or height is 0, the width or height of the texture will be used.
- * The flip value specifies whether the texture shoudl be flipped on the U or V
+ * The flip value specifies whether the texture should be flipped on the U or V
* axis with GS_FLIP_U and GS_FLIP_V.
*/
EXPORT void gs_draw_sprite(gs_texture_t *tex, uint32_t flip, uint32_t width,
/** sets the viewport to current swap chain size */
EXPORT void gs_reset_viewport(void);
-/** sets default screen-sized orthographich mode */
+/** sets default screen-sized orthographic mode */
EXPORT void gs_set_2d_mode(void);
/** sets default screen-sized perspective mode */
EXPORT void gs_set_3d_mode(double fovy, double znear, double zvar);
obs-studio-18.0.1.tar.xz/libobs/obs-audio-controls.h -> obs-studio-18.0.2.tar.xz/libobs/obs-audio-controls.h
Changed
EXPORT void obs_volmeter_detach_source(obs_volmeter_t *volmeter);
/**
- * @brief Get signal handler for the volume meter object
- * @param volmeter pointer to the volume meter object
- * @return signal handler
- */
-EXPORT signal_handler_t *obs_volmeter_get_signal_handler(
- obs_volmeter_t *volmeter);
-
-/**
* @brief Set the update interval for the volume meter
* @param volmeter pointer to the volume meter object
* @param ms update interval in ms
obs-studio-18.0.1.tar.xz/libobs/obs-audio.c -> obs-studio-18.0.2.tar.xz/libobs/obs-audio.c
Changed
/* if perpetually pending data, it means the audio has stopped,
* so clear the audio data */
if (last_size == size) {
+ if (!source->pending_stop) {
+ source->pending_stop = true;
+#if DEBUG_AUDIO == 1
+ blog(LOG_DEBUG, "doing pending stop trick: '%s'",
+ source->context.name);
+#endif
+ return true;
+ }
+
for (size_t ch = 0; ch < channels; ch++)
circlebuf_pop_front(&source->audio_input_buf[ch], NULL,
source->audio_input_buf[ch].size);
+ source->pending_stop = false;
source->audio_ts = 0;
source->last_audio_input_buf_size = 0;
#if DEBUG_AUDIO == 1
if (start_point == AUDIO_OUTPUT_FRAMES) {
#if DEBUG_AUDIO == 1
if (is_audio_source)
- blog(LOG_DEBUG, "can't dicard, start point is "
+ blog(LOG_DEBUG, "can't discard, start point is "
"at audio frame count");
#endif
return;
ts->end);
#endif
+ source->pending_stop = false;
source->audio_ts = ts->end;
}
obs-studio-18.0.1.tar.xz/libobs/obs-config.h -> obs-studio-18.0.2.tar.xz/libobs/obs-config.h
Changed
*
* Reset to zero each major or minor version
*/
-#define LIBOBS_API_PATCH_VER 1
+#define LIBOBS_API_PATCH_VER 2
#define MAKE_SEMANTIC_VERSION(major, minor, patch) \
((major << 24) | \
obs-studio-18.0.1.tar.xz/libobs/obs-encoder.h -> obs-studio-18.0.2.tar.xz/libobs/obs-encoder.h
Changed
/**
* Updates the settings for this encoder (usually used for things like
- * changeing birate while active)
+ * changing bitrate while active)
*
* @param data Data associated with this encoder context
* @param settings New settings for this encoder
obs-studio-18.0.1.tar.xz/libobs/obs-hotkey.h -> obs-studio-18.0.2.tar.xz/libobs/obs-hotkey.h
Changed
* that may not have translations. If the operating system can provide
* translations for these keys, it will use the operating system's translation
* over these translations. If no translations are specified, it will use
- * the default english translations for that specific operating system. */
+ * the default English translations for that specific operating system. */
EXPORT void obs_hotkeys_set_translations_s(
struct obs_hotkeys_translations *translations, size_t size);
obs-studio-18.0.1.tar.xz/libobs/obs-internal.h -> obs-studio-18.0.2.tar.xz/libobs/obs-internal.h
Changed
/* audio */
bool audio_failed;
bool audio_pending;
+ bool pending_stop;
bool user_muted;
bool muted;
struct obs_source *next_audio_source;
bool async_flip;
bool async_active;
bool async_update_texture;
+ struct obs_source_frame *async_preload_frame;
DARRAY(struct async_frame) async_cache;
DARRAY(struct obs_source_frame*)async_frames;
pthread_mutex_t async_mutex;
obs-studio-18.0.1.tar.xz/libobs/obs-module.h -> obs-studio-18.0.2.tar.xz/libobs/obs-module.h
Changed
* may need loading.
*
* @return Return true to continue loading the module, otherwise
- * false to indcate failure and unload the module
+ * false to indicate failure and unload the module
*/
MODULE_EXPORT bool obs_module_load(void);
obs-studio-18.0.1.tar.xz/libobs/obs-output.c -> obs-studio-18.0.2.tar.xz/libobs/obs-output.c
Changed
struct encoder_packet out = output->interleaved_packets.array[0];
/* do not send an interleaved packet if there's no packet of the
- * opposing type of a higher timstamp in the interleave buffer.
+ * opposing type of a higher timestamp in the interleave buffer.
* this ensures that the timestamps are monotonic */
if (!has_higher_opposing_ts(output, &out))
return;
if (!active(output))
return;
- // split text into 32 charcter strings
+ // split text into 32 character strings
int size = (int)strlen(text);
int r;
size_t char_count;
obs-studio-18.0.1.tar.xz/libobs/obs-scene.h -> obs-studio-18.0.2.tar.xz/libobs/obs-scene.h
Changed
uint32_t align;
/* last width/height of the source, this is used to check whether
- * ths transform needs updating */
+ * the transform needs updating */
uint32_t last_width;
uint32_t last_height;
obs-studio-18.0.1.tar.xz/libobs/obs-source.c -> obs-studio-18.0.2.tar.xz/libobs/obs-source.c
Changed
if (!private)
obs_source_init_audio_hotkeys(source);
+ source->flags = source->default_flags;
+
/* allow the source to be created even if creation fails so that the
* user's data doesn't become lost */
if (info)
private ? "private " : "", name, id);
obs_source_dosignal(source, "source_create", NULL);
- source->flags = source->default_flags;
source->enabled = true;
return source;
return obs_source_create_internal(id, name, settings, NULL, true);
}
+static char *get_new_filter_name(obs_source_t *dst, const char *name)
+{
+ struct dstr new_name = {0};
+ int inc = 0;
+
+ dstr_copy(&new_name, name);
+
+ for (;;) {
+ obs_source_t *existing_filter = obs_source_get_filter_by_name(
+ dst, new_name.array);
+ if (!existing_filter)
+ break;
+
+ obs_source_release(existing_filter);
+
+ dstr_printf(&new_name, "%s %d", name, ++inc + 1);
+ }
+
+ return new_name.array;
+}
+
static void duplicate_filters(obs_source_t *dst, obs_source_t *src,
bool private)
{
for (size_t i = filters.num; i > 0; i--) {
obs_source_t *src_filter = filters.array[i - 1];
+ char *new_name = get_new_filter_name(dst,
+ src_filter->context.name);
+
obs_source_t *dst_filter = obs_source_duplicate(src_filter,
- src_filter->context.name, private);
+ new_name, private);
+ bfree(new_name);
obs_source_filter_add(dst, dst_filter);
obs_source_release(dst_filter);
obs_source_release(src_filter);
da_free(filters);
}
+void obs_source_copy_filters(obs_source_t *dst, obs_source_t *src)
+{
+ duplicate_filters(dst, src, dst->context.private ?
+ OBS_SCENE_DUP_PRIVATE_COPY :
+ OBS_SCENE_DUP_COPY);
+
+ obs_source_release(src);
+}
+
obs_source_t *obs_source_duplicate(obs_source_t *source,
const char *new_name, bool create_private)
{
audio_resampler_destroy(source->resampler);
bfree(source->audio_output_buf[0][0]);
+ obs_source_frame_destroy(source->async_preload_frame);
+
if (source->info.type == OBS_SOURCE_TYPE_TRANSITION)
obs_transition_free(source);
source->last_audio_input_buf_size = 0;
source->audio_ts = os_time;
+ source->next_audio_sys_ts_min = os_time;
}
static void handle_ts_jump(obs_source_t *source, uint64_t expected,
}
}
+static inline bool preload_frame_changed(obs_source_t *source,
+ const struct obs_source_frame *in)
+{
+ if (!source->async_preload_frame)
+ return true;
+
+ return in->width != source->async_preload_frame->width ||
+ in->height != source->async_preload_frame->height ||
+ in->format != source->async_preload_frame->format;
+}
+
+void obs_source_preload_video(obs_source_t *source,
+ const struct obs_source_frame *frame)
+{
+ if (!obs_source_valid(source, "obs_source_preload_video"))
+ return;
+ if (!frame)
+ return;
+
+ obs_enter_graphics();
+
+ if (preload_frame_changed(source, frame)) {
+ obs_source_frame_destroy(source->async_preload_frame);
+ source->async_preload_frame = obs_source_frame_create(
+ frame->format,
+ frame->width,
+ frame->height);
+ }
+
+ copy_frame_data(source->async_preload_frame, frame);
+ set_async_texture_size(source, source->async_preload_frame);
+ update_async_texture(source, source->async_preload_frame,
+ source->async_texture,
+ source->async_texrender);
+
+ source->last_frame_ts = frame->timestamp;
+
+ obs_leave_graphics();
+}
+
+void obs_source_show_preloaded_video(obs_source_t *source)
+{
+ uint64_t sys_ts;
+
+ if (!obs_source_valid(source, "obs_source_show_preloaded_video"))
+ return;
+
+ source->async_active = true;
+
+ pthread_mutex_lock(&source->audio_buf_mutex);
+ sys_ts = os_gettime_ns();
+ reset_audio_timing(source, source->last_frame_ts, sys_ts);
+ reset_audio_data(source, sys_ts);
+ pthread_mutex_unlock(&source->audio_buf_mutex);
+}
+
static inline struct obs_audio_data *filter_async_audio(obs_source_t *source,
struct obs_audio_data *in)
{
if (!obs_source_valid(source, "obs_source_set_monitoring_type"))
return;
- if (source->info.output_flags & OBS_SOURCE_DO_NOT_MONITOR)
- return;
if (source->monitoring_type == type)
return;
obs-studio-18.0.1.tar.xz/libobs/obs-source.h -> obs-studio-18.0.2.tar.xz/libobs/obs-source.h
Changed
/**
* Source cannot have its audio monitored
*
- * Specifies that this source may cause a feedback loop if audio is monitored.
+ * Specifies that this source may cause a feedback loop if audio is monitored
+ * with a device selected as desktop audio.
+ *
* This is used primarily with desktop audio capture sources.
*/
-#define OBS_SOURCE_DO_NOT_MONITOR (1<<9)
+#define OBS_SOURCE_DO_NOT_SELF_MONITOR (1<<9)
/** @} */
* Creates the source data for the source
*
* @param settings Settings to initialize the source with
- * @param source Source that this data is assoicated with
+ * @param source Source that this data is associated with
* @return The data associated with this source
*/
void *(*create)(obs_data_t *settings, obs_source_t *source);
* If the source output flags do not include SOURCE_CUSTOM_DRAW, all
* a source needs to do is set the "image" parameter of the effect to
* the desired texture, and then draw. If the output flags include
- * SOURCE_COLOR_MATRIX, you may optionally set the the "color_matrix"
+ * SOURCE_COLOR_MATRIX, you may optionally set the "color_matrix"
* parameter of the effect to a custom 4x4 conversion matrix (by
* default it will be set to an YUV->RGB conversion matrix)
*
size_t size);
/**
- * Regsiters a source definition to the current obs context. This should be
+ * Registers a source definition to the current obs context. This should be
* used in obs_module_load.
*
* @param info Pointer to the source definition structure
obs-studio-18.0.1.tar.xz/libobs/obs-ui.h -> obs-studio-18.0.2.tar.xz/libobs/obs-ui.h
Changed
};
/**
- * Regsiters a modal UI definition to the current obs context. This should be
+ * Registers a modal UI definition to the current obs context. This should be
* used in obs_module_load.
*
* @param info Pointer to the modal definition structure
obs-studio-18.0.1.tar.xz/libobs/obs.c -> obs-studio-18.0.2.tar.xz/libobs/obs.c
Changed
} while (false)
FREE_REGISTERED_TYPES(obs_source_info, obs->source_types);
- FREE_REGISTERED_TYPES(obs_source_info, obs->input_types);
- FREE_REGISTERED_TYPES(obs_source_info, obs->filter_types);
- FREE_REGISTERED_TYPES(obs_source_info, obs->transition_types);
FREE_REGISTERED_TYPES(obs_output_info, obs->output_types);
FREE_REGISTERED_TYPES(obs_encoder_info, obs->encoder_types);
FREE_REGISTERED_TYPES(obs_service_info, obs->service_types);
#undef FREE_REGISTERED_TYPES
+ da_free(obs->input_types);
+ da_free(obs->filter_types);
+ da_free(obs->transition_types);
+
stop_video();
stop_hotkeys();
obs_data_t *hotkeys = obs_data_get_obj(source_data, "hotkeys");
double volume;
int64_t sync;
- uint32_t flags;
uint32_t mixers;
int di_order;
int di_mode;
mixers = (uint32_t)obs_data_get_int(source_data, "mixers");
obs_source_set_audio_mixers(source, mixers);
- obs_data_set_default_int(source_data, "flags", source->default_flags);
- flags = (uint32_t)obs_data_get_int(source_data, "flags");
- obs_source_set_flags(source, flags);
-
obs_data_set_default_bool(source_data, "enabled", true);
obs_source_set_enabled(source,
obs_data_get_bool(source_data, "enabled"));
float volume = obs_source_get_volume(source);
uint32_t mixers = obs_source_get_audio_mixers(source);
int64_t sync = obs_source_get_sync_offset(source);
- uint32_t flags = obs_source_get_flags(source);
const char *name = obs_source_get_name(source);
const char *id = obs_source_get_id(source);
bool enabled = obs_source_enabled(source);
obs_data_set_obj (source_data, "settings", settings);
obs_data_set_int (source_data, "mixers", mixers);
obs_data_set_int (source_data, "sync", sync);
- obs_data_set_int (source_data, "flags", flags);
obs_data_set_double(source_data, "volume", volume);
obs_data_set_bool (source_data, "enabled", enabled);
obs_data_set_bool (source_data, "muted", muted);
obs-studio-18.0.1.tar.xz/libobs/obs.h -> obs-studio-18.0.2.tar.xz/libobs/obs.h
Changed
EXPORT profiler_name_store_t *obs_get_profiler_name_store(void);
/**
- * Sets base video ouput base resolution/fps/format.
+ * Sets base video output base resolution/fps/format.
*
- * @note This data cannot be changed if an output is corrently active.
+ * @note This data cannot be changed if an output is currently active.
* @note The graphics module cannot be changed without fully destroying the
* OBS context.
*
* @param ovi Pointer to an obs_video_info structure containing the
* specification of the graphics subsystem,
- * @return OBS_VIDEO_SUCCESS if sucessful
+ * @return OBS_VIDEO_SUCCESS if successful
* OBS_VIDEO_NOT_SUPPORTED if the adapter lacks capabilities
* OBS_VIDEO_INVALID_PARAM if a parameter is invalid
* OBS_VIDEO_CURRENTLY_ACTIVE if video is currently active
/**
* Initializes the module, which calls its obs_module_load export. If the
- * module is alrady loaded, then this function does nothing and returns
+ * module is already loaded, then this function does nothing and returns
* successful.
*/
EXPORT bool obs_init_module(obs_module_t *module);
EXPORT obs_source_t *obs_source_get_filter_by_name(obs_source_t *source,
const char *name);
+EXPORT void obs_source_copy_filters(obs_source_t *dst, obs_source_t *src);
+
EXPORT bool obs_source_enabled(const obs_source_t *source);
EXPORT void obs_source_set_enabled(obs_source_t *source, bool enabled);
EXPORT void obs_source_output_video(obs_source_t *source,
const struct obs_source_frame *frame);
+/** Preloads asynchronous video data to allow instantaneous playback */
+EXPORT void obs_source_preload_video(obs_source_t *source,
+ const struct obs_source_frame *frame);
+
+/** Shows any preloaded video data */
+EXPORT void obs_source_show_preloaded_video(obs_source_t *source);
+
/** Outputs audio data (always asynchronous) */
EXPORT void obs_source_output_audio(obs_source_t *source,
const struct obs_source_audio *audio);
* Creates a scene.
*
* A scene is a source which is a container of other sources with specific
- * display oriantations. Scenes can also be used like any other source.
+ * display orientations. Scenes can also be used like any other source.
*/
EXPORT obs_scene_t *obs_scene_create(const char *name);
EXPORT void obs_sceneitem_select(obs_sceneitem_t *item, bool select);
EXPORT bool obs_sceneitem_selected(const obs_sceneitem_t *item);
-/* Functions for gettings/setting specific orientation of a scene item */
+/* Functions for getting/setting specific orientation of a scene item */
EXPORT void obs_sceneitem_set_pos(obs_sceneitem_t *item, const struct vec2 *pos);
EXPORT void obs_sceneitem_set_rot(obs_sceneitem_t *item, float rot_deg);
EXPORT void obs_sceneitem_set_scale(obs_sceneitem_t *item,
EXPORT proc_handler_t *obs_output_get_proc_handler(const obs_output_t *output);
/**
- * Sets the current video media context associated with this output,
- * required for non-encoded outputs
- */
-EXPORT void obs_output_set_video(obs_output_t *output, video_t *video);
-
-/**
* Sets the current audio/video media contexts associated with this output,
* required for non-encoded outputs. Can be null.
*/
obs-studio-18.0.1.tar.xz/libobs/util/base.h -> obs-studio-18.0.2.tar.xz/libobs/util/base.h
Changed
* Use if a problem occurs that doesn't affect the program and is
* recoverable.
*
- * Use in places where where failure isn't entirely unexpected, and can
+ * Use in places where failure isn't entirely unexpected, and can
* be handled safely.
*/
LOG_WARNING = 200,
/**
- * Informative essage to be displayed in the log.
+ * Informative message to be displayed in the log.
*/
LOG_INFO = 300,
obs-studio-18.0.1.tar.xz/libobs/util/c99defs.h -> obs-studio-18.0.2.tar.xz/libobs/util/c99defs.h
Changed
* incredibly inept moron could possibly be managing the visual C compiler
* project. They should be fired, and legally forbidden to have a job in
* ANYTHING even REMOTELY related to programming. FOREVER. This should also
- * apply to the next 10 generations all of their descendents. */
+ * apply to the next 10 generations all of their descendants. */
#ifndef __cplusplus
#define inline __inline
#endif
obs-studio-18.0.1.tar.xz/libobs/util/cf-lexer.h -> obs-studio-18.0.2.tar.xz/libobs/util/cf-lexer.h
Changed
* + option to exclude features such as #import, variadic macros, and other
* features for certain language implementations
* + macro parameter string operator #
- * + macro parameter token concactenation operator ##
+ * + macro parameter token concatenation operator ##
* + predefined macros
* + restricted macros
*/
obs-studio-18.0.1.tar.xz/libobs/util/config-file.h -> obs-studio-18.0.2.tar.xz/libobs/util/config-file.h
Changed
* These do *not* actually set any values, they only set what values will be
* returned for config_get_* if the specified variable does not exist.
*
- * You can initialize the defaults programmitically using config_set_default_*
+ * You can initialize the defaults programmatically using config_set_default_*
* functions (recommended for most cases), or you can initialize it via a file
* with config_open_defaults.
*/
obs-studio-18.0.1.tar.xz/libobs/util/darray.h -> obs-studio-18.0.2.tar.xz/libobs/util/darray.h
Changed
* NOTE: Not type-safe when using directly.
* Specifying size per call with inline maximizes compiler optimizations
*
- * See DARRAY macro at the bottom of thhe file for slightly safer usage.
+ * See DARRAY macro at the bottom of the file for slightly safer usage.
*/
#define DARRAY_INVALID ((size_t)-1)
* Makes it a little easier to use as well.
*
* I did -not- want to use a gigantic macro to generate a crapload of
- * typsafe inline functions per type. It just feels like a mess to me.
+ * typesafe inline functions per type. It just feels like a mess to me.
*/
#define DARRAY(type) \
obs-studio-18.0.1.tar.xz/libobs/util/text-lookup.h -> obs-studio-18.0.2.tar.xz/libobs/util/text-lookup.h
Changed
/*
* Text Lookup interface
*
- * Used for storing and looking up localized strings. Stores locazation
+ * Used for storing and looking up localized strings. Stores localization
* strings in a radix/trie tree to efficiently look up associated strings via a
* unique string identifier name.
*/
extern "C" {
#endif
-/* opaque typdef */
+/* opaque typedef */
struct text_lookup;
typedef struct text_lookup lookup_t;
obs-studio-18.0.1.tar.xz/libobs/util/utf8.c -> obs-studio-18.0.2.tar.xz/libobs/util/utf8.c
Changed
/*
* NOTE: do not check here for forbidden UTF-8 characters.
- * They cannot appear here because we do proper convertion.
+ * They cannot appear here because we do proper conversion.
*/
p += n;
obs-studio-18.0.1.tar.xz/libobs/util/vc/vc_stdint.h -> obs-studio-18.0.2.tar.xz/libobs/util/vc/vc_stdint.h
Changed
/* 7.18.4.1 Macros for minimum-width integer constants
- Accoding to Douglas Gwyn <gwyn@arl.mil>:
+ According to Douglas Gwyn <gwyn@arl.mil>:
"This spec was changed in ISO/IEC 9899:1999 TC1; in ISO/IEC
9899:1999 as initially published, the expansion was required
to be an integer constant of precisely matching type, which
obs-studio-18.0.2.tar.xz/plugins/decklink/audio-repack.c
Added
+#include "audio-repack.h"
+
+#include <emmintrin.h>
+
+int check_buffer(struct audio_repack *repack,
+ uint32_t frame_count)
+{
+ const uint32_t new_size = frame_count * repack->base_dst_size
+ + repack->extra_dst_size;
+
+ if (repack->packet_size < new_size) {
+ repack->packet_buffer = brealloc(
+ repack->packet_buffer, new_size);
+ if (!repack->packet_buffer)
+ return -1;
+
+ repack->packet_size = new_size;
+ }
+
+ return 0;
+}
+
+/*
+ Swap channel between LFE and FC, and
+ squash data array
+
+ | FL | FR |LFE | FC | BL | BR |emp |emp |
+ | | x | |
+ | FL | FR | FC |LFE | BL | BR |
+ */
+int repack_8to6ch_swap23(struct audio_repack *repack,
+ const uint8_t *bsrc, uint32_t frame_count)
+{
+ if (check_buffer(repack, frame_count) < 0)
+ return -1;
+
+ const uint32_t size = frame_count * repack->base_src_size;
+
+ const __m128i *src = (__m128i *)bsrc;
+ const __m128i *esrc = src + frame_count;
+ uint32_t *dst = (uint32_t *)repack->packet_buffer;
+ while (src != esrc) {
+ __m128i target = _mm_load_si128(src++);
+ __m128i buf = _mm_shufflelo_epi16(target, _MM_SHUFFLE(2, 3, 1, 0));
+ _mm_storeu_si128((__m128i *)dst, buf);
+ dst += 3;
+ }
+
+ return 0;
+}
+
+/*
+ Swap channel between LFE and FC
+
+ | FL | FR |LFE | FC | BL | BR |SBL |SBR |
+ | | x | | | |
+ | FL | FR | FC |LFE | BL | BR |SBL |SBR |
+ */
+int repack_8ch_swap23(struct audio_repack *repack,
+ const uint8_t *bsrc, uint32_t frame_count)
+{
+ if (check_buffer(repack, frame_count) < 0)
+ return -1;
+
+ const uint32_t size = frame_count * repack->base_src_size;
+
+ const __m128i *src = (__m128i *)bsrc;
+ const __m128i *esrc = src + frame_count;
+ __m128i *dst = (__m128i *)repack->packet_buffer;
+ while (src != esrc) {
+ __m128i target = _mm_load_si128(src++);
+ __m128i buf = _mm_shufflelo_epi16(target, _MM_SHUFFLE(2, 3, 1, 0));
+ _mm_store_si128(dst++, buf);
+ }
+
+ return 0;
+}
+
+int audio_repack_init(struct audio_repack *repack,
+ audio_repack_mode_t repack_mode, uint8_t sample_bit)
+{
+ memset(repack, 0, sizeof(*repack));
+
+ if (sample_bit != 16)
+ return -1;
+
+ switch (repack_mode) {
+ case repack_mode_8to6ch_swap23:
+ repack->base_src_size = 8 * (16 / 8);
+ repack->base_dst_size = 6 * (16 / 8);
+ repack->extra_dst_size = 2;
+ repack->repack_func = &repack_8to6ch_swap23;
+ break;
+
+ case repack_mode_8ch_swap23:
+ repack->base_src_size = 8 * (16 / 8);
+ repack->base_dst_size = 8 * (16 / 8);
+ repack->extra_dst_size = 0;
+ repack->repack_func = &repack_8ch_swap23;
+ break;
+
+ default: return -1;
+ }
+
+ return 0;
+}
+
+void audio_repack_free(struct audio_repack *repack)
+{
+ if (repack->packet_buffer)
+ bfree(repack->packet_buffer);
+
+ memset(repack, 0, sizeof(*repack));
+}
obs-studio-18.0.2.tar.xz/plugins/decklink/audio-repack.h
Added
+#pragma once
+
+#ifdef __cplusplus
+extern "C" {
+#endif
+
+#include <stdint.h>
+#include <string.h>
+
+#include <obs.h>
+
+struct audio_repack;
+
+typedef int (*audio_repack_func_t)(struct audio_repack *,
+ const uint8_t *, uint32_t);
+
+struct audio_repack {
+ uint8_t *packet_buffer;
+ uint32_t packet_size;
+
+ uint32_t base_src_size;
+ uint32_t base_dst_size;
+ uint32_t extra_dst_size;
+
+ audio_repack_func_t repack_func;
+};
+
+enum _audio_repack_mode {
+ repack_mode_8to6ch_swap23,
+ repack_mode_8ch_swap23,
+};
+
+typedef enum _audio_repack_mode audio_repack_mode_t;
+
+extern int audio_repack_init(struct audio_repack *repack,
+ audio_repack_mode_t repack_mode, uint8_t sample_bit);
+extern void audio_repack_free(struct audio_repack *repack);
+
+#ifdef __cplusplus
+}
+#endif
obs-studio-18.0.2.tar.xz/plugins/decklink/audio-repack.hpp
Added
+#pragma once
+
+#include "audio-repack.h"
+
+class AudioRepacker {
+ struct audio_repack arepack;
+
+public:
+ inline AudioRepacker(audio_repack_mode_t repack_mode)
+ {
+ audio_repack_init(&arepack, repack_mode, 16);
+ }
+ inline ~AudioRepacker()
+ {
+ audio_repack_free(&arepack);
+ }
+
+ inline int repack(const uint8_t *src, uint32_t frame_size)
+ {
+ return (*arepack.repack_func)(&arepack, src, frame_size);
+ }
+
+ inline operator struct audio_repack*() {return &arepack;}
+ inline struct audio_repack *operator->() {return &arepack;}
+};
obs-studio-18.0.1.tar.xz/plugins/decklink/data/locale/en-US.ini -> obs-studio-18.0.2.tar.xz/plugins/decklink/data/locale/en-US.ini
Changed
Mode="Mode"
Buffering="Use Buffering"
PixelFormat="Pixel Format"
+ChannelFormat="Channel"
+ChannelFormat.None="None"
+ChannelFormat.2_0ch="2ch"
+ChannelFormat.5_1ch="5.1ch"
+ChannelFormat.5_1chBack="5.1ch (Back)"
+ChannelFormat.7_1ch="7.1ch"
obs-studio-18.0.1.tar.xz/plugins/decklink/decklink-device-instance.cpp -> obs-studio-18.0.2.tar.xz/plugins/decklink/decklink-device-instance.cpp
Changed
#include "decklink-device-instance.hpp"
+#include "audio-repack.hpp"
#include <util/platform.h>
#include <util/threading.h>
#define LOG(level, message, ...) blog(level, "%s: " message, \
obs_source_get_name(this->decklink->GetSource()), ##__VA_ARGS__)
+#define ISSTEREO(flag) ((flag) == SPEAKERS_STEREO)
+
static inline enum video_format ConvertPixelFormat(BMDPixelFormat format)
{
switch (format) {
return VIDEO_FORMAT_UYVY;
}
+static inline int ConvertChannelFormat(speaker_layout format)
+{
+ switch (format) {
+ case SPEAKERS_5POINT1:
+ case SPEAKERS_5POINT1_SURROUND:
+ case SPEAKERS_7POINT1:
+ return 8;
+
+ default:
+ case SPEAKERS_STEREO:
+ return 2;
+ }
+}
+
+static inline audio_repack_mode_t ConvertRepackFormat(speaker_layout format)
+{
+ switch (format) {
+ case SPEAKERS_5POINT1:
+ case SPEAKERS_5POINT1_SURROUND:
+ return repack_mode_8to6ch_swap23;
+
+ case SPEAKERS_7POINT1:
+ return repack_mode_8ch_swap23;
+
+ default:
+ assert(false && "No repack requested");
+ return (audio_repack_mode_t)-1;
+ }
+}
+
DeckLinkDeviceInstance::DeckLinkDeviceInstance(DeckLink *decklink_,
DeckLinkDevice *device_) :
currentFrame(), currentPacket(), decklink(decklink_), device(device_)
return;
}
- currentPacket.data[0] = (uint8_t *)bytes;
- currentPacket.frames = (uint32_t)audioPacket->GetSampleFrameCount();
- currentPacket.timestamp = timestamp;
+ const uint32_t frameCount = (uint32_t)audioPacket->GetSampleFrameCount();
+ currentPacket.frames = frameCount;
+ currentPacket.timestamp = timestamp;
+
+ if (!ISSTEREO(channelFormat)) {
+ if (audioRepacker->repack((uint8_t *)bytes, frameCount) < 0) {
+ LOG(LOG_ERROR, "Failed to convert audio packet data");
+ return;
+ }
+
+ currentPacket.data[0] = (*audioRepacker)->packet_buffer;
+ } else {
+ currentPacket.data[0] = (uint8_t *)bytes;
+ }
+
+ nextAudioTS = timestamp +
+ ((uint64_t)frameCount * 1000000000ULL / 48000ULL) + 1;
	obs_source_output_audio(decklink->GetSource(), &currentPacket);
}
	obs_source_output_video(decklink->GetSource(), &currentFrame);
}
+void DeckLinkDeviceInstance::FinalizeStream()
+{
+ input->SetCallback(nullptr);
+
+ if (audioRepacker != nullptr)
+ {
+ delete audioRepacker;
+ audioRepacker = nullptr;
+ }
+
+ mode = nullptr;
+}
+
bool DeckLinkDeviceInstance::StartCapture(DeckLinkDeviceMode *mode_)
{
if (mode != nullptr)
pixelFormat = decklink->GetPixelFormat();
currentFrame.format = ConvertPixelFormat(pixelFormat);
- input->SetCallback(this);
-
const BMDDisplayMode displayMode = mode_->GetDisplayMode();
const HRESULT videoResult = input->EnableVideoInput(displayMode,
if (videoResult != S_OK) {
LOG(LOG_ERROR, "Failed to enable video input");
- input->SetCallback(nullptr);
return false;
}
- const HRESULT audioResult = input->EnableAudioInput(
- bmdAudioSampleRate48kHz, bmdAudioSampleType16bitInteger,
- 2);
+ channelFormat = decklink->GetChannelFormat();
+ currentPacket.speakers = channelFormat;
+
+ if (channelFormat != SPEAKERS_UNKNOWN) {
+ const int channel = ConvertChannelFormat(channelFormat);
+ const HRESULT audioResult = input->EnableAudioInput(
+ bmdAudioSampleRate48kHz, bmdAudioSampleType16bitInteger,
+ channel);
- if (audioResult != S_OK)
- LOG(LOG_WARNING, "Failed to enable audio input; continuing...");
+ if (audioResult != S_OK)
+ LOG(LOG_WARNING, "Failed to enable audio input; continuing...");
+
+ if (!ISSTEREO(channelFormat)) {
+ const audio_repack_mode_t repack_mode = ConvertRepackFormat(channelFormat);
+ audioRepacker = new AudioRepacker(repack_mode);
+ }
+ }
+
+ if (input->SetCallback(this) != S_OK) {
+ LOG(LOG_ERROR, "Failed to set callback");
+ FinalizeStream();
+ return false;
+ }
if (input->StartStreams() != S_OK) {
LOG(LOG_ERROR, "Failed to start streams");
- input->SetCallback(nullptr);
- input->DisableVideoInput();
- input->DisableAudioInput();
+ FinalizeStream();
return false;
}
GetDevice()->GetDisplayName().c_str());
input->StopStreams();
- input->SetCallback(nullptr);
- input->DisableVideoInput();
- input->DisableAudioInput();
-
- mode = nullptr;
+ FinalizeStream();
return true;
}
BMDTimeValue videoDur = 0;
BMDTimeValue audioTS = 0;
- if (videoFrame)
+ if (videoFrame) {
videoFrame->GetStreamTime(&videoTS, &videoDur, TIME_BASE);
- if (audioPacket)
- audioPacket->GetPacketTime(&audioTS, TIME_BASE);
+ lastVideoTS = (uint64_t)videoTS;
+ }
+ if (audioPacket) {
+ BMDTimeValue newAudioTS = 0;
+ int64_t diff;
+
+ audioPacket->GetPacketTime(&newAudioTS, TIME_BASE);
+ audioTS = newAudioTS + audioOffset;
+
+ diff = (int64_t)audioTS - (int64_t)nextAudioTS;
+ if (diff > 10000000LL) {
+ audioOffset -= diff;
+ audioTS = newAudioTS + audioOffset;
+
+ } else if (diff < -1000000) {
+ audioOffset = 0;
+ audioTS = newAudioTS;
+ }
+ }
if (videoFrame && videoTS >= 0)
HandleVideoFrame(videoFrame, (uint64_t)videoTS);
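The audio hunk above implements a simple drift rule: accumulate a negative offset when a packet arrives more than 10 ms ahead of the predicted time, and drop the correction entirely when it falls more than 1 ms behind. A minimal standalone sketch of that rule, using the nanosecond TIME_BASE from the diff (the type and member names here are ours, not the plugin's):

```cpp
#include <cassert>
#include <cstdint>

// Illustrative sketch of the resync logic in the diff above. Timestamps are
// in nanoseconds, so 10000000 is 10 ms and 1000000 is 1 ms.
struct AudioResync {
	int64_t offset = 0;       // accumulated correction (audioOffset in the diff)
	int64_t nextExpected = 0; // predicted time of the next packet (nextAudioTS)

	int64_t Apply(int64_t packetTS) {
		int64_t ts = packetTS + offset;
		int64_t diff = ts - nextExpected;
		if (diff > 10000000LL) {
			// Drifted >10 ms ahead of prediction: fold the gap into
			// the offset so this and later packets line up again.
			offset -= diff;
			ts = packetTS + offset;
		} else if (diff < -1000000LL) {
			// Fell >1 ms behind: discard the correction and trust
			// the device timestamp directly.
			offset = 0;
			ts = packetTS;
		}
		return ts;
	}
};
```

In the real code the prediction (`nextAudioTS`) is advanced elsewhere from the packet duration; the sketch only shows the correction step.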
obs-studio-18.0.1.tar.xz/plugins/decklink/decklink-device-instance.hpp -> obs-studio-18.0.2.tar.xz/plugins/decklink/decklink-device-instance.hpp
Changed
#include "decklink-device.hpp"
+class AudioRepacker;
+
class DeckLinkDeviceInstance : public IDeckLinkInputCallback {
protected:
struct obs_source_frame currentFrame;
BMDPixelFormat pixelFormat = bmdFormat8BitYUV;
ComPtr<IDeckLinkInput> input;
volatile long refCount = 1;
+ int64_t audioOffset = 0;
+ uint64_t nextAudioTS = 0;
+ uint64_t lastVideoTS = 0;
+ AudioRepacker *audioRepacker = nullptr;
+ speaker_layout channelFormat = SPEAKERS_STEREO;
+
+ void FinalizeStream();
void HandleAudioPacket(IDeckLinkAudioInputPacket *audioPacket,
const uint64_t timestamp);
}
inline BMDPixelFormat GetActivePixelFormat() const {return pixelFormat;}
+ inline speaker_layout GetActiveChannelFormat() const {return channelFormat;}
inline DeckLinkDeviceMode *GetMode() const {return mode;}
obs-studio-18.0.1.tar.xz/plugins/decklink/decklink-device.cpp -> obs-studio-18.0.2.tar.xz/plugins/decklink/decklink-device.cpp
Changed
if (result != S_OK)
return true;
+ int64_t channels;
+ /* Intensity Shuttle for Thunderbolt returns 2; however, it supports 8 channels */
+ if (name == "Intensity Shuttle Thunderbolt")
+ maxChannel = 8;
+ else if (attributes->GetInt(BMDDeckLinkMaximumAudioChannels, &channels) == S_OK)
+ maxChannel = (int32_t)channels;
+ else
+ maxChannel = 2;
+
/* http://forum.blackmagicdesign.com/viewtopic.php?f=12&t=33967
* BMDDeckLinkTopologicalID for older devices
* BMDDeckLinkPersistentID for newer ones */
{
return name;
}
+
+const int32_t DeckLinkDevice::GetMaxChannel(void) const
+{
+ return maxChannel;
+}
obs-studio-18.0.1.tar.xz/plugins/decklink/decklink-device.hpp -> obs-studio-18.0.2.tar.xz/plugins/decklink/decklink-device.hpp
Changed
std::string name;
std::string displayName;
std::string hash;
+ int32_t maxChannel;
volatile long refCount = 1;
public:
const std::string& GetHash(void) const;
const std::vector<DeckLinkDeviceMode *>& GetModes(void) const;
const std::string& GetName(void) const;
+ const int32_t GetMaxChannel(void) const;
bool GetInput(IDeckLinkInput **input);
obs-studio-18.0.1.tar.xz/plugins/decklink/decklink.cpp -> obs-studio-18.0.2.tar.xz/plugins/decklink/decklink.cpp
Changed
if (!isActive)
return false;
if (instance->GetActiveModeId() == modeId &&
- instance->GetActivePixelFormat() == pixelFormat)
+ instance->GetActivePixelFormat() == pixelFormat &&
+ instance->GetActiveChannelFormat() == channelFormat)
return false;
}
obs-studio-18.0.1.tar.xz/plugins/decklink/decklink.hpp -> obs-studio-18.0.2.tar.xz/plugins/decklink/decklink.hpp
Changed
volatile long activateRefs = 0;
std::recursive_mutex deviceMutex;
BMDPixelFormat pixelFormat = bmdFormat8BitYUV;
+ speaker_layout channelFormat = SPEAKERS_STEREO;
void SaveSettings();
static void DevicesChanged(void *param, DeckLinkDevice *device,
{
pixelFormat = format;
}
+ inline speaker_layout GetChannelFormat() const {return channelFormat;}
+ inline void SetChannelFormat(speaker_layout format)
+ {
+ channelFormat = format;
+ }
bool Activate(DeckLinkDevice *device, long long modeId);
void Deactivate();
obs-studio-18.0.1.tar.xz/plugins/decklink/linux/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/plugins/decklink/linux/CMakeLists.txt
Changed
../decklink-device-discovery.hpp
../decklink-device.hpp
../decklink-device-mode.hpp
+ ../audio-repack.h
+ ../audio-repack.hpp
)
set(linux-decklink_SOURCES
../decklink-device-discovery.cpp
../decklink-device.cpp
../decklink-device-mode.cpp
+ ../audio-repack.c
platform.cpp)
add_library(linux-decklink MODULE
obs-studio-18.0.1.tar.xz/plugins/decklink/mac/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/plugins/decklink/mac/CMakeLists.txt
Changed
../decklink-device-discovery.hpp
../decklink-device.hpp
../decklink-device-mode.hpp
+ ../audio-repack.h
+ ../audio-repack.hpp
)
set(mac-decklink_SOURCES
../decklink-device-discovery.cpp
../decklink-device.cpp
../decklink-device-mode.cpp
+ ../audio-repack.c
platform.cpp)
add_library(mac-decklink MODULE
obs-studio-18.0.1.tar.xz/plugins/decklink/plugin-main.cpp -> obs-studio-18.0.2.tar.xz/plugins/decklink/plugin-main.cpp
Changed
OBS_DECLARE_MODULE()
OBS_MODULE_USE_DEFAULT_LOCALE("decklink", "en-US")
+#define DEVICE_HASH "device_hash"
+#define DEVICE_NAME "device_name"
+#define MODE_ID "mode_id"
+#define MODE_NAME "mode_name"
+#define CHANNEL_FORMAT "channel_format"
+#define PIXEL_FORMAT "pixel_format"
+#define BUFFERING "buffering"
+
+#define TEXT_DEVICE obs_module_text("Device")
+#define TEXT_MODE obs_module_text("Mode")
+#define TEXT_PIXEL_FORMAT obs_module_text("PixelFormat")
+#define TEXT_CHANNEL_FORMAT obs_module_text("ChannelFormat")
+#define TEXT_CHANNEL_FORMAT_NONE obs_module_text("ChannelFormat.None")
+#define TEXT_CHANNEL_FORMAT_2_0CH obs_module_text("ChannelFormat.2_0ch")
+#define TEXT_CHANNEL_FORMAT_5_1CH obs_module_text("ChannelFormat.5_1ch")
+#define TEXT_CHANNEL_FORMAT_5_1CH_BACK obs_module_text("ChannelFormat.5_1chBack")
+#define TEXT_CHANNEL_FORMAT_7_1CH obs_module_text("ChannelFormat.7_1ch")
+#define TEXT_BUFFERING obs_module_text("Buffering")
+
static DeckLinkDeviceDiscovery *deviceEnum = nullptr;
static void decklink_enable_buffering(DeckLink *decklink, bool enabled)
DeckLink *decklink = new DeckLink(source, deviceEnum);
decklink_enable_buffering(decklink,
- obs_data_get_bool(settings, "buffering"));
+ obs_data_get_bool(settings, BUFFERING));
obs_source_update(source, settings);
return decklink;
static void decklink_update(void *data, obs_data_t *settings)
{
DeckLink *decklink = (DeckLink *)data;
- const char *hash = obs_data_get_string(settings, "device_hash");
- long long id = obs_data_get_int(settings, "mode_id");
- BMDPixelFormat format = (BMDPixelFormat)obs_data_get_int(settings,
- "pixel_format");
+ const char *hash = obs_data_get_string(settings, DEVICE_HASH);
+ long long id = obs_data_get_int(settings, MODE_ID);
+ BMDPixelFormat pixelFormat = (BMDPixelFormat)obs_data_get_int(settings,
+ PIXEL_FORMAT);
+ speaker_layout channelFormat = (speaker_layout)obs_data_get_int(settings,
+ CHANNEL_FORMAT);
decklink_enable_buffering(decklink,
- obs_data_get_bool(settings, "buffering"));
+ obs_data_get_bool(settings, BUFFERING));
ComPtr<DeckLinkDevice> device;
device.Set(deviceEnum->FindByHash(hash));
- decklink->SetPixelFormat(format);
+ decklink->SetPixelFormat(pixelFormat);
+ decklink->SetChannelFormat(channelFormat);
decklink->Activate(device, id);
}
static void decklink_get_defaults(obs_data_t *settings)
{
- obs_data_set_default_bool(settings, "buffering", true);
- obs_data_set_default_int(settings, "pixel_format", bmdFormat8BitYUV);
+ obs_data_set_default_bool(settings, BUFFERING, true);
+ obs_data_set_default_int(settings, PIXEL_FORMAT, bmdFormat8BitYUV);
+ obs_data_set_default_int(settings, CHANNEL_FORMAT, SPEAKERS_STEREO);
}
static const char *decklink_get_name(void*)
static bool decklink_device_changed(obs_properties_t *props,
obs_property_t *list, obs_data_t *settings)
{
- const char *name = obs_data_get_string(settings, "device_name");
- const char *hash = obs_data_get_string(settings, "device_hash");
- const char *mode = obs_data_get_string(settings, "mode_name");
- long long modeId = obs_data_get_int(settings, "mode_id");
+ const char *name = obs_data_get_string(settings, DEVICE_NAME);
+ const char *hash = obs_data_get_string(settings, DEVICE_HASH);
+ const char *mode = obs_data_get_string(settings, MODE_NAME);
+ long long modeId = obs_data_get_int(settings, MODE_ID);
size_t itemCount = obs_property_list_item_count(list);
bool itemFound = false;
obs_property_list_item_disable(list, 0, true);
}
- list = obs_properties_get(props, "mode_id");
+ obs_property_t *modeList = obs_properties_get(props, MODE_ID);
+ obs_property_t *channelList = obs_properties_get(props, CHANNEL_FORMAT);
+
+ obs_property_list_clear(modeList);
- obs_property_list_clear(list);
+ obs_property_list_clear(channelList);
+ obs_property_list_add_int(channelList, TEXT_CHANNEL_FORMAT_NONE,
+ SPEAKERS_UNKNOWN);
+ obs_property_list_add_int(channelList, TEXT_CHANNEL_FORMAT_2_0CH,
+ SPEAKERS_STEREO);
ComPtr<DeckLinkDevice> device;
device.Set(deviceEnum->FindByHash(hash));
if (!device) {
- obs_property_list_add_int(list, mode, modeId);
- obs_property_list_item_disable(list, 0, true);
+ obs_property_list_add_int(modeList, mode, modeId);
+ obs_property_list_item_disable(modeList, 0, true);
} else {
const std::vector<DeckLinkDeviceMode*> &modes =
device->GetModes();
for (DeckLinkDeviceMode *mode : modes) {
- obs_property_list_add_int(list,
+ obs_property_list_add_int(modeList,
mode->GetName().c_str(),
mode->GetId());
}
+
+ if (device->GetMaxChannel() >= 8) {
+ obs_property_list_add_int(channelList, TEXT_CHANNEL_FORMAT_5_1CH,
+ SPEAKERS_5POINT1);
+ obs_property_list_add_int(channelList, TEXT_CHANNEL_FORMAT_5_1CH_BACK,
+ SPEAKERS_5POINT1_SURROUND);
+ obs_property_list_add_int(channelList, TEXT_CHANNEL_FORMAT_7_1CH,
+ SPEAKERS_7POINT1);
+ }
}
return true;
{
obs_properties_t *props = obs_properties_create();
- obs_property_t *list = obs_properties_add_list(props, "device_hash",
- obs_module_text("Device"), OBS_COMBO_TYPE_LIST,
- OBS_COMBO_FORMAT_STRING);
+ obs_property_t *list = obs_properties_add_list(props, DEVICE_HASH,
+ TEXT_DEVICE, OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_STRING);
obs_property_set_modified_callback(list, decklink_device_changed);
fill_out_devices(list);
- list = obs_properties_add_list(props, "mode_id",
- obs_module_text("Mode"), OBS_COMBO_TYPE_LIST,
- OBS_COMBO_FORMAT_INT);
+ list = obs_properties_add_list(props, MODE_ID, TEXT_MODE,
+ OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- list = obs_properties_add_list(props, "pixel_format",
- obs_module_text("PixelFormat"), OBS_COMBO_TYPE_LIST,
+ list = obs_properties_add_list(props, PIXEL_FORMAT,
+ TEXT_PIXEL_FORMAT, OBS_COMBO_TYPE_LIST,
OBS_COMBO_FORMAT_INT);
-
obs_property_list_add_int(list, "8-bit YUV", bmdFormat8BitYUV);
obs_property_list_add_int(list, "8-bit BGRA", bmdFormat8BitBGRA);
- obs_properties_add_bool(props, "buffering",
- obs_module_text("Buffering"));
+ list = obs_properties_add_list(props, CHANNEL_FORMAT,
+ TEXT_CHANNEL_FORMAT, OBS_COMBO_TYPE_LIST,
+ OBS_COMBO_FORMAT_INT);
+ obs_property_list_add_int(list, TEXT_CHANNEL_FORMAT_NONE,
+ SPEAKERS_UNKNOWN);
+ obs_property_list_add_int(list, TEXT_CHANNEL_FORMAT_2_0CH,
+ SPEAKERS_STEREO);
+
+ obs_properties_add_bool(props, BUFFERING, TEXT_BUFFERING);
UNUSED_PARAMETER(data);
return props;
obs-studio-18.0.1.tar.xz/plugins/decklink/win/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/plugins/decklink/win/CMakeLists.txt
Changed
../decklink-device-discovery.hpp
../decklink-device.hpp
../decklink-device-mode.hpp
+ ../audio-repack.h
+ ../audio-repack.hpp
)
set(win-decklink_SOURCES
../decklink-device-discovery.cpp
../decklink-device.cpp
../decklink-device-mode.cpp
+ ../audio-repack.c
platform.cpp)
add_idl_files(win-decklink-sdk_GENERATED_FILES
obs-studio-18.0.1.tar.xz/plugins/enc-amf/#Resources/Installer.in.iss -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/#Resources/Installer.in.iss
Changed
; SEE THE DOCUMENTATION FOR DETAILS ON CREATING INNO SETUP SCRIPT FILES!
#define MyAppName "AMD Encoder for OBS Studio"
-#define MyAppVersion "@enc-amf_VERSION_MAJOR@.@enc-amf_VERSION_MINOR@.@enc-amf_VERSION_PATCH@.@enc-amf_VERSION_BUILD@"
+#define MyAppVersion "@enc-amf_VERSION_MAJOR@.@enc-amf_VERSION_MINOR@.@enc-amf_VERSION_PATCH@"
#define MyAppPublisher "Xaymars Technology Workshop"
#define MyAppURL "http://www.xaymar.com/portfolio/plugin-amd-vce-plugin-for-obs-studio/"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/#Resources/PATCH_NOTES.md -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/#Resources/PATCH_NOTES.md
Changed
-# 1.4.3.8 - Settings Transfer, Automatic VBV Buffer adjustment and Fixes (Hotfix 2)
-Another day, another new feature: this time it is transferring settings between versions, so that you will no longer use settings when a change to a setting is made. Since it only just now started tracking the config version, it will only work with settings created between 1.4.3.0 and 1.4.3.5, any other version might end up with broken settings.
+# 2.0.0.0 - The 'Ashes of the Phoenix' Update
+The plugin rises from its no-update phase to bring you an even better experience, with High Efficiency Video Coding (HEVC) encoding, Variance Based Adaptive Quantization (VBAQ), Pre-Pass support and an Asynchronous Queue mode.
-Another change has been done to the Automatic VBV Buffer Size, which will now behave much more predictable. A value of 0% is completely unrestricted, 50% matches the calculated bitrate and 100% matches the calculated strict bitrate.
+High Efficiency Video Coding (H265/HEVC) is available on the Polaris architecture and offers massively better quality at the same bitrate, essentially making lower bitrates look much better. This means that at 1000 kbit, H265/HEVC can get close to or surpass the quality of 2500 kbit H264/AVC in many types of scenes and motion. Unfortunately it never took off, as it offers only slightly better quality than VP9, and AV1 is already beating it in terms of speed and quality.
-Presets will also now use the proper minimum and maximum QP values and the minimum QP default value has been increased to 11.
+Variance Based Adaptive Quantization (VBAQ) and Pre-Pass are both methods to better distribute the bitrate within a given frame. VBAQ works on the principle of visual perception, while Pre-Pass looks at which areas need more bitrate to avoid ending up blocky. Enabling both results in much better output with no change in bitrate.
-Hotfix 1: Fix enumeration based properties not working correctly due to a programming error.
-Hotfix 2: Actually fix the enumeration based properties for real this time.
+Asynchronous Queue is a new feature that used to be the standard behaviour in earlier versions. Since no two CPUs are the same, Asynchronous Queue offers a way to use multiple CPU cores for the encoding task, instead of handling everything on a single core. This feature is in very early stages, so it may cause issues; avoid it unless absolutely needed.
## Changelog
-* Added: Version-specific setting transfer code which should reduce the lost settings between updates.
-* Changed: VBV Buffer Strictness is now linear with three steps: 100000 (0%), Target Bitrate (50%) and Strict Target Bitrate (100%).
-* Changed: Default for Minimum QP is now 11.
-* Fixed: Presets not using the proper QP Minimum and Maximum.
-* Fixed: Startup log messages not showing proper error codes.
-* Hotfix: Fix enumeration based properties not using the correct values.
-* Hotfix: Fix the default value for B-Frame Pattern being '-1' due to an oversight in code.
\ No newline at end of file
+* Redesigned the internal structure to be much faster for much less CPU usage.
+* Fixed several object lifetime issues that were only visible while debugging.
+* Massively improved capability testing which now allows us to see the exact limits of the encoder.
+* Fixed a crash-to-desktop on closing OBS.
+* Added H264/AVC and H265/HEVC encoders.
+* Slightly redesigned the UI to further improve quality of life.
+* Removed 'OpenGL' and 'Host' entries from the Video API field as they aren't actual APIs. (#216)
+* Removed the useless 'Usage' field, which was causing a lot of PEBKAC issues. (#210)
+* Added the ability to use Keyframe Interval in 'Master' View Mode.
+* Updated preset 'High Quality' to use a QP value of 18/18/18.
+* Updated preset 'Indistinguishable' to use a QP value of 15/15/15.
+* Fixed a crash with 'Automatic' VBV Buffer while using 'Constant QP'.
+* Fixed 'Filler Data' and 'Enforce HRD' not working properly at all times. (#215)
+* Changed the behaviour of 'Automatic' VBV Buffer to be linear with 'Constant QP'.
+* Massively improved output timestamps.
+* Fired Steve.
+* Fixed a rare crash that could happen with certain translations.
+* Changed the default for 'VBAQ' to 'Enabled' for H264 and H265.
+* Added an Asynchronous Queue mode which enables safe multi-threading for slightly higher CPU usage and encoding latency. (#211)
+* Split the 'OpenCL' field into 'OpenCL Transfer' (Use OpenCL to send the frame to the GPU?) and 'OpenCL Conversion' (Use OpenCL instead of DirectCompute for frame conversion?). (#212, #214)
+* Fixed certain presets permanently locking the Keyframe Interval and IDR Period to low numbers. (#213)
+* Improved Performance Tracking which is visible when starting OBS with --verbose --log_unfiltered.
obs-studio-18.0.1.tar.xz/plugins/enc-amf/#Resources/package.in.bat -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/#Resources/package.in.bat
Changed
@ECHO OFF
REM Shut up, I'm just as lazy as anyone else.
SET "CURDIR=%~dp0"
-SET "FINALNAME=AMD-Encoder-for-OBS-Studio.@enc-amf_VERSION_MAJOR@.@enc-amf_VERSION_MINOR@.@enc-amf_VERSION_PATCH@.@enc-amf_VERSION_BUILD@"
+SET "FINALNAME=AMD-Encoder-for-OBS-Studio.@enc-amf_VERSION_MAJOR@.@enc-amf_VERSION_MINOR@.@enc-amf_VERSION_PATCH@"
SET "SevenZip=C:\Program Files\7-Zip\7z.exe"
SET "InnoSetup=C:\Program Files (x86)\Inno Setup 5\Compil32.exe"
obs-studio-18.0.2.tar.xz/plugins/enc-amf/.gitattributes
Added
+###############################################################################
+# Set default behavior to automatically normalize line endings.
+###############################################################################
+* text=auto
+
+###############################################################################
+# Set default behavior for command prompt diff.
+#
+# This is needed for earlier builds of msysgit that do not have it on by
+# default for csharp files.
+# Note: This is only used by command line
+###############################################################################
+#*.cs diff=csharp
+
+###############################################################################
+# Set the merge driver for project and solution files
+#
+# Merging from the command prompt will add diff markers to the files if there
+# are conflicts (Merging from VS is not affected by the settings below, in VS
+# the diff markers are never inserted). Diff markers may cause the following
+# file extensions to fail to load in VS. An alternative would be to treat
+# these files as binary and thus will always conflict and require user
+# intervention with every merge. To do so, just uncomment the entries below
+###############################################################################
+#*.sln merge=binary
+#*.csproj merge=binary
+#*.vbproj merge=binary
+#*.vcxproj merge=binary
+#*.vcproj merge=binary
+#*.dbproj merge=binary
+#*.fsproj merge=binary
+#*.lsproj merge=binary
+#*.wixproj merge=binary
+#*.modelproj merge=binary
+#*.sqlproj merge=binary
+#*.wwaproj merge=binary
+
+###############################################################################
+# behavior for image files
+#
+# image files are treated as binary by default.
+###############################################################################
+#*.jpg binary
+#*.png binary
+#*.gif binary
+
+###############################################################################
+# diff behavior for common document formats
+#
+# Convert binary document formats to text before diffing them. This feature
+# is only available from the command line. Turn it on by uncommenting the
+# entries below.
+###############################################################################
+#*.doc diff=astextplain
+#*.DOC diff=astextplain
+#*.docx diff=astextplain
+#*.DOCX diff=astextplain
+#*.dot diff=astextplain
+#*.DOT diff=astextplain
+#*.pdf diff=astextplain
+#*.PDF diff=astextplain
+#*.rtf diff=astextplain
+#*.RTF diff=astextplain
obs-studio-18.0.2.tar.xz/plugins/enc-amf/.mailmap
Added
+Jim <obs.jim@gmail.com>
+Michael Fabian Dirks <info@xaymar.com> <michael.dirks@xaymar.com>
obs-studio-18.0.2.tar.xz/plugins/enc-amf/AUTHORS
Added
+Contributors:
+- Michael Fabian 'Xaymar' Dirks <michael.dirks@xaymar.com>
+- max20091
+- Marcos Vidal Martinez
+- Viacheslav
+- jackun
+- jp9000
+- nwgat
+- wazer
+- Horváth Dániel
+- M4RK22
+- Richard Stanway
+
+Patrons on Patreon (https://patreon.com/xaymar/):
+- Anaz Haidhar
+- AJ
+- Benjamin Hoffmeister
+- Bo
+- Bryan Furia
+- DaOrgest
+- Dominik Roth
+- Jeremy "razorlikes" Nieth
+- Kristian Kirkesæther
+- Kuo Shih
+- Kytos
+- Nicholas Kreimeyer
+- Nico Thate
+- NoxiousPluK
+- nwgat.ninja
+- Oldgam3r
+- Omega Drik Mage
+- prefixs
+- Rene "vDex" Dirks
+- shiny
+- Simon Vacker
+- SneakyJoe
+- Spikeypup
+- Vinicius Guilherme
\ No newline at end of file
obs-studio-18.0.1.tar.xz/plugins/enc-amf/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/CMakeLists.txt
Changed
cmake_minimum_required(VERSION 2.8.12)
-
-# Shared (OBS Studio & Standalone)
PROJECT(enc-amf)
+
+################################################################################
+# Version
+################################################################################
+SET(enc-amf_VERSION_MAJOR 2)
+SET(enc-amf_VERSION_MINOR 1)
+SET(enc-amf_VERSION_PATCH 0)
+configure_file(
+ "${PROJECT_SOURCE_DIR}/#Resources/package.in.bat"
+ "${PROJECT_SOURCE_DIR}/#Resources/package.bat"
+)
+configure_file(
+ "${PROJECT_SOURCE_DIR}/#Resources/Installer.in.iss"
+ "${PROJECT_SOURCE_DIR}/#Resources/Installer.iss"
+)
+configure_file(
+ "${PROJECT_SOURCE_DIR}/Include/Version.h.in"
+ "${PROJECT_BINARY_DIR}/Include/Version.h"
+)
+
+################################################################################
+# Code
+################################################################################
SET(enc-amf_HEADERS
"Include/amf.h"
"Include/amf-capabilities.h"
- "Include/amf-h264.h"
+ "Include/amf-encoder.h"
"Include/api-base.h"
- "Include/api-d3d9.h"
- "Include/api-d3d11.h"
"Include/api-host.h"
"Include/api-opengl.h"
- "Include/enc-h264.h"
+ "Include/utility.h"
"Include/plugin.h"
+ "Include/strings.h"
"${PROJECT_BINARY_DIR}/Include/Version.h"
)
SET(enc-amf_SOURCES
"Source/amf.cpp"
"Source/amf-capabilities.cpp"
- "Source/amf-h264.cpp"
+ "Source/amf-encoder.cpp"
"Source/api-base.cpp"
"Source/api-host.cpp"
- "Source/api-d3d9.cpp"
- "Source/api-d3d11.cpp"
"Source/api-opengl.cpp"
- "Source/enc-h264.cpp"
- "Source/misc-util.cpp"
+ "Source/utility.cpp"
"Source/plugin.cpp"
)
SET(enc-amf_LIBRARIES
winmm
)
-# Version
-SET(enc-amf_VERSION_MAJOR 1)
-SET(enc-amf_VERSION_MINOR 4)
-SET(enc-amf_VERSION_PATCH 3)
-SET(enc-amf_VERSION_BUILD 11)
-configure_file(
- "${PROJECT_SOURCE_DIR}/#Resources/package.in.bat"
- "${PROJECT_SOURCE_DIR}/#Resources/package.bat"
-)
-configure_file(
- "${PROJECT_SOURCE_DIR}/#Resources/Installer.in.iss"
- "${PROJECT_SOURCE_DIR}/#Resources/Installer.iss"
-)
-configure_file(
- "${PROJECT_SOURCE_DIR}/Include/Version.h.in"
- "${PROJECT_BINARY_DIR}/Include/Version.h"
-)
+# Windows Only
+if (WIN32)
+ LIST(APPEND enc-amf_HEADERS
+ "Include/api-d3d9.h"
+ "Include/api-d3d11.h"
+ )
+ LIST(APPEND enc-amf_SOURCES
+ "Source/api-d3d9.cpp"
+ "Source/api-d3d11.cpp"
+ )
+endif()
-# OBS Studio Specific
+################################################################################
+# Standalone and OBS Studio Build Data
+################################################################################
if(BUILD_AMF_ENCODER)
+ # OBS Studio Specific
+
+ # Variables
OPTION(AMDAMF_Disable "Disable AMD Advanced Media Framework support" OFF)
SET(AMDAMF_SDKDir "" CACHE PATH "AMD Advanced Media Framework SDK Directory")
+ # Tests
if(AMDAMF_Disable)
message(STATUS "AMD AMF support disabled")
return()
return()
endif()
+ # Directories
INCLUDE_DIRECTORIES(
"${CMAKE_SOURCE_DIR}"
"${PROJECT_BINARY_DIR}"
)
SET(LIBOBS_LIBRARIES libobs)
else()
-# Standlone Specific
+ # Standalone Specific
# Variables
SET(PATH_AMDAMFSDK "" CACHE PATH "AMD Advanced Media Framework SDK Directory")
SET(PATH_OBSStudio "" CACHE PATH "OBS Studio Source Code Directory")
- #SET(PATH_libobs "" CACHE PATH "Path to obs.lib from OBS Studio")
+ # Tests
if(PATH_AMDAMFSDK STREQUAL "")
message(STATUS "PATH_AMDAMFSDK not set!")
return()
return()
endif()
- # Stuff
+ # Find OBS Libraries
SET(obsPath "${PATH_OBSStudio}")
INCLUDE("${PATH_OBSStudio}/cmake/external/Findlibobs.cmake")
"${PATH_AMDAMFSDK}/amf/public/include"
"${PATH_OBSStudio}/"
)
+ add_definitions(-D_CRT_SECURE_NO_WARNINGS)
endif()
+################################################################################
+# Optional Features
+################################################################################
+# AVC Encoding
+OPTION(BUILD_AMF_AVC "AMF Plugin: Build AVC (H264) support" ON)
+
+if(BUILD_AMF_AVC)
+ add_definitions(-DWITH_AVC)
+ LIST(APPEND enc-amf_HEADERS
+ "Include/amf-encoder-h264.h"
+ "Include/enc-h264.h"
+ )
+ LIST(APPEND enc-amf_SOURCES
+ "Source/amf-encoder-h264.cpp"
+ "Source/enc-h264.cpp"
+ )
+endif()
+
+# HEVC Encoding
+OPTION(BUILD_AMF_HEVC "AMF Plugin: Build HEVC support" ON)
+if(BUILD_AMF_HEVC)
+ add_definitions(-DWITH_HEVC)
+ LIST(APPEND enc-amf_HEADERS
+ "Include/amf-encoder-h265.h"
+ "Include/enc-h265.h"
+ )
+ LIST(APPEND enc-amf_SOURCES
+ "Source/amf-encoder-h265.cpp"
+ "Source/enc-h265.cpp"
+ )
+endif()
+
+################################################################################
+# Build
+################################################################################
ADD_LIBRARY(enc-amf MODULE
${enc-amf_HEADERS}
${enc-amf_SOURCES}
${enc-amf_LIBRARIES}
)
+# All Warnings, Extra Warnings, Pedantic
+if(MSVC)
+ # Force to always compile with W4
+ if(CMAKE_CXX_FLAGS MATCHES "/W[0-4]")
+ string(REGEX REPLACE "/W[0-4]" "/W4" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
+ else()
+ set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /W4")
+ endif()
+elseif(CMAKE_COMPILER_IS_GNUCC OR CMAKE_COMPILER_IS_GNUCXX)
+ # Update if necessary
+ set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -Wno-long-long -pedantic")
+endif()
+
if(BUILD_AMF_ENCODER)
install_obs_plugin_with_data(enc-amf Resources)
else()
COMMAND ${CMAKE_COMMAND} -E copy
"$<TARGET_FILE:enc-amf>"
"${PROJECT_SOURCE_DIR}/#Build/obs-plugins/${BITS}bit/$<TARGET_FILE_NAME:enc-amf>"
- )
+ )
add_custom_command(TARGET enc-amf POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
"$<TARGET_FILE_DIR:enc-amf>/enc-amf.pdb"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/CONTRIBUTING.md -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/CONTRIBUTING.md
Changed
* AMF: amf.cpp, amf.h, amf-capabilities.cpp, amf-capabilities.h
* Encoder: amf-encoder.cpp, amf-encoder.h
-* H264: amf-h264.cpp, amf-h264.h, enc-h264.cpp, enc-h264.h
+* H264: amf-encoder-h264.cpp, amf-encoder-h264.h, enc-h264.cpp, enc-h264.h
+* H265: amf-encoder-h265.cpp, amf-encoder-h265.h, enc-h265.cpp, enc-h265.h
* API: api-base.cpp, api-base.h
* API-OpenGL: api-opengl.cpp, api-opengl.h
* API-Direct3D9: api-d3d9.cpp, api-d3d9.h
* API-Direct3D11: api-d3d11.cpp, api-d3d11.h
* API-Host: api-host.cpp, api-host.h
* Plugin: plugin.cpp, plugin.h, CMakeLists.txt
-* Utilities: misc-util.cpp
-* Locale: Any locale files
+* Utilities: utility.cpp, utility.h
+* Locale: strings.h, Any locale files
* Resources: Any resource files
### Commits
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/Version.h.in -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/Version.h.in
Changed
+/*
+MIT License
-#define PLUGIN_VERSION_MAJOR @enc-amf_VERSION_MAJOR@
-#define PLUGIN_VERSION_MINOR @enc-amf_VERSION_MINOR@
-#define PLUGIN_VERSION_PATCH @enc-amf_VERSION_PATCH@
-#define PLUGIN_VERSION_BUILD @enc-amf_VERSION_BUILD@
\ No newline at end of file
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#pragma once
+
+const uint16_t PLUGIN_VERSION_MAJOR = @enc-amf_VERSION_MAJOR@;
+const uint16_t PLUGIN_VERSION_MINOR = @enc-amf_VERSION_MINOR@;
+const uint32_t PLUGIN_VERSION_PATCH = @enc-amf_VERSION_PATCH@;
+const uint64_t PLUGIN_VERSION_FULL = (((uint64_t)(PLUGIN_VERSION_MAJOR & 0xFFFF) << 48ull) | ((uint64_t)(PLUGIN_VERSION_MINOR & 0xFFFF) << 32ull) | ((uint64_t)(PLUGIN_VERSION_PATCH) & 0xFFFFFFFF));
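The new `PLUGIN_VERSION_FULL` constant packs the three components into a single 64-bit value: 16 bits of major, 16 bits of minor, and 32 bits of patch. A hedged sketch of the same scheme with matching unpack helpers (the helper names are ours, not part of the plugin):

```cpp
#include <cassert>
#include <cstdint>

// Same bit layout as PLUGIN_VERSION_FULL in Version.h.in:
// [63..48] major | [47..32] minor | [31..0] patch
constexpr uint64_t pack_version(uint16_t major, uint16_t minor, uint32_t patch) {
	return ((uint64_t)(major & 0xFFFF) << 48) |
	       ((uint64_t)(minor & 0xFFFF) << 32) |
	       ((uint64_t)patch & 0xFFFFFFFF);
}
constexpr uint16_t version_major(uint64_t v) { return (uint16_t)(v >> 48); }
constexpr uint16_t version_minor(uint64_t v) { return (uint16_t)(v >> 32); }
constexpr uint32_t version_patch(uint64_t v) { return (uint32_t)(v & 0xFFFFFFFF); }
```

Because the components occupy disjoint bit ranges, packed versions compare correctly with plain integer `<`/`>` ordering, which is presumably why the plugin switched to this form.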
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/amf-capabilities.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/amf-capabilities.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
// Plugin
#include "plugin.h"
#include "amf.h"
-#include "amf-h264.h"
+#include "amf-encoder.h"
+#include "amf-encoder-h264.h"
+#include "amf-encoder-h265.h"
#include "api-base.h"
// AMF
namespace Plugin {
namespace AMD {
- volatile struct VCEDeviceCapabilities {
- amf::AMF_ACCELERATION_TYPE acceleration_type;
- uint32_t maxProfile;
- uint32_t maxProfileLevel;
- uint32_t maxBitrate;
- uint32_t minReferenceFrames;
- uint32_t maxReferenceFrames;
- bool supportsBFrames;
- bool supportsFixedSliceMode;
- uint32_t maxTemporalLayers;
- uint32_t maxNumOfStreams;
- uint32_t maxNumOfHwInstances;
-
- struct IOCaps {
- int32_t minWidth, maxWidth;
- int32_t minHeight, maxHeight;
- bool supportsInterlaced;
- uint32_t verticalAlignment;
-
- std::vector<std::pair<amf::AMF_SURFACE_FORMAT, bool>> formats;
- std::vector<std::pair<amf::AMF_MEMORY_TYPE, bool>> memoryTypes;
- } input, output;
-
- Plugin::AMD::VCEDeviceCapabilities::VCEDeviceCapabilities();
- };
-
- class VCECapabilities {
- //////////////////////////////////////////////////////////////////////////
- // Singleton
- //////////////////////////////////////////////////////////////////////////
+ class CapabilityManager {
+ #pragma region Singleton
public:
- static std::shared_ptr<Plugin::AMD::VCECapabilities> GetInstance();
- static void ReportCapabilities(std::shared_ptr<Plugin::API::Base> api);
- static void ReportAdapterCapabilities(std::shared_ptr<Plugin::API::Base> api,
- Plugin::API::Adapter adapter);
- static void ReportAdapterTypeCapabilities(std::shared_ptr<Plugin::API::Base> api,
- Plugin::API::Adapter adapter,
- H264EncoderType type);
- static void ReportAdapterTypeIOCapabilities(std::shared_ptr<Plugin::API::Base> api,
- Plugin::API::Adapter adapter,
- H264EncoderType type,
- bool output);
-
- //////////////////////////////////////////////////////////////////////////
- // Class
- //////////////////////////////////////////////////////////////////////////
- public:
- VCECapabilities();
- ~VCECapabilities();
-
- bool Refresh();
- std::vector<std::pair<H264EncoderType, VCEDeviceCapabilities>>
- GetAllAdapterCapabilities(std::shared_ptr<Plugin::API::Base> api, Plugin::API::Adapter adapter);
- VCEDeviceCapabilities
- GetAdapterCapabilities(std::shared_ptr<Plugin::API::Base> api, Plugin::API::Adapter adapter, H264EncoderType type);
+ static void Initialize();
+ static CapabilityManager* Instance();
+ static void Finalize();
+
+ private: // Private Initializer & Finalizer
+ CapabilityManager();
+ ~CapabilityManager();
+
+ public: // Remove all Copy operators
+ CapabilityManager(CapabilityManager const&) = delete;
+ void operator=(CapabilityManager const&) = delete;
+ #pragma endregion Singleton
+
+ bool IsCodecSupported(AMD::Codec codec);
+ bool IsCodecSupportedByAPI(AMD::Codec codec, API::Type api);
+ bool IsCodecSupportedByAPIAdapter(AMD::Codec codec, API::Type api, API::Adapter adapter);
private:
- std::map<std::tuple<std::string, Plugin::API::Adapter, Plugin::AMD::H264EncoderType>, VCEDeviceCapabilities> capabilityMap;
+ std::map<
+ std::tuple<API::Type, API::Adapter, AMD::Codec>,
+ bool> m_CapabilityMap;
+
};
}
}
\ No newline at end of file
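The diff above replaces the old `std::shared_ptr`-based `GetInstance()` singleton with an explicit `Initialize()`/`Instance()`/`Finalize()` lifetime and deleted copy operations. A minimal sketch of that pattern, with hypothetical names (`Manager`, `Query`) standing in for `CapabilityManager`:

```cpp
#include <cassert>
#include <cstddef>

// Sketch of the Initialize/Instance/Finalize singleton style used by
// CapabilityManager and AMF in this diff. Names are illustrative only.
class Manager {
public:
    static void Initialize() { if (!s_instance) s_instance = new Manager(); }
    static Manager* Instance() { return s_instance; } // valid only between Initialize/Finalize
    static void Finalize() { delete s_instance; s_instance = nullptr; }

    // Copying a singleton is always a bug, so the copy operations are deleted.
    Manager(Manager const&) = delete;
    void operator=(Manager const&) = delete;

    int Query() const { return 42; } // stand-in for a capability lookup

private:
    Manager() = default;   // construction only via Initialize()
    ~Manager() = default;  // destruction only via Finalize()

    static Manager* s_instance;
};

Manager* Manager::s_instance = nullptr;
```

This trades the automatic cleanup of the old `GetInstance()` for deterministic startup/shutdown ordering, which matters when the instance wraps an externally loaded runtime such as the AMF module.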
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/amf-encoder-h264.h
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#pragma once
+
+#include "plugin.h"
+#include "amf-encoder.h"
+#include "components/VideoEncoderVCE.h"
+
+namespace Plugin {
+ namespace AMD {
+ namespace H264 {
+ enum class SliceMode : uint8_t {
+ Row = 1, // Horizontal?
+ Column = 2, // Vertical?
+ };
+ }
+
+ class EncoderH264 : public Encoder {
+ public:
+ EncoderH264(
+ std::shared_ptr<API::IAPI> videoAPI, API::Adapter videoAdapter = API::Adapter::Adapter(),
+ bool useOpenCLSubmission = false, bool useOpenCLConversion = false,
+ ColorFormat colorFormat = ColorFormat::NV12, ColorSpace colorSpace = ColorSpace::BT709, bool fullRangeColor = false,
+ bool useAsyncQueue = false, size_t asyncQueueSize = 0);
+ virtual ~EncoderH264();
+
+ // Properties - Initialization
+ virtual std::vector<Usage> CapsUsage() override;
+ virtual void SetUsage(Usage v) override;
+ virtual Usage GetUsage() override;
+
+ // Properties - Static
+ virtual std::vector<QualityPreset> CapsQualityPreset() override;
+ virtual void SetQualityPreset(QualityPreset v) override;
+ virtual QualityPreset GetQualityPreset() override;
+
+ virtual std::vector<Profile> CapsProfile() override;
+ virtual void SetProfile(Profile v) override;
+ virtual Profile GetProfile() override;
+
+ virtual std::vector<ProfileLevel> CapsProfileLevel() override;
+ virtual void SetProfileLevel(ProfileLevel v) override;
+ virtual ProfileLevel GetProfileLevel() override;
+
+ virtual std::pair<uint64_t, uint64_t> CapsMaximumReferenceFrames() override;
+ virtual void SetMaximumReferenceFrames(uint64_t v) override;
+ virtual uint64_t GetMaximumReferenceFrames() override;
+
+ virtual std::pair<std::pair<uint32_t, uint32_t>, std::pair<uint32_t, uint32_t>> CapsResolution() override;
+ virtual void SetResolution(std::pair<uint32_t, uint32_t> v) override;
+ virtual std::pair<uint32_t, uint32_t> GetResolution() override;
+
+ virtual void SetAspectRatio(std::pair<uint32_t, uint32_t> v) override;
+ virtual std::pair<uint32_t, uint32_t> GetAspectRatio() override;
+
+ virtual void SetFrameRate(std::pair<uint32_t, uint32_t> v) override;
+ virtual std::pair<uint32_t, uint32_t> GetFrameRate() override;
+
+ virtual std::vector<CodingType> CapsCodingType() override;
+ virtual void SetCodingType(CodingType v) override;
+ virtual CodingType GetCodingType() override;
+
+ virtual std::pair<uint32_t, uint32_t> CapsMaximumLongTermReferenceFrames() override;
+ virtual void SetMaximumLongTermReferenceFrames(uint32_t v) override;
+ virtual uint32_t GetMaximumLongTermReferenceFrames() override;
+
+ // Properties - Dynamic
+ virtual std::vector<RateControlMethod> CapsRateControlMethod() override;
+ virtual void SetRateControlMethod(RateControlMethod v) override;
+ virtual RateControlMethod GetRateControlMethod() override;
+
+ virtual std::vector<PrePassMode> CapsPrePassMode() override;
+ virtual void SetPrePassMode(PrePassMode v) override;
+ virtual PrePassMode GetPrePassMode() override;
+
+ virtual void SetVarianceBasedAdaptiveQuantizationEnabled(bool v) override;
+ virtual bool IsVarianceBasedAdaptiveQuantizationEnabled() override;
+
+ virtual void SetFrameSkippingEnabled(bool v) override;
+ virtual bool IsFrameSkippingEnabled() override;
+
+ virtual void SetEnforceHRDEnabled(bool v) override;
+ virtual bool IsEnforceHRDEnabled() override;
+
+ virtual void SetFillerDataEnabled(bool v) override;
+ virtual bool IsFillerDataEnabled() override;
+
+ void SetQPMinimum(uint8_t v);
+ uint8_t GetQPMinimum();
+
+ void SetQPMaximum(uint8_t v);
+ uint8_t GetQPMaximum();
+
+ virtual std::pair<uint64_t, uint64_t> CapsTargetBitrate() override;
+ virtual void SetTargetBitrate(uint64_t v) override;
+ virtual uint64_t GetTargetBitrate() override;
+
+ virtual std::pair<uint64_t, uint64_t> CapsPeakBitrate() override;
+ virtual void SetPeakBitrate(uint64_t v) override;
+ virtual uint64_t GetPeakBitrate() override;
+
+ virtual void SetIFrameQP(uint8_t v) override;
+ virtual uint8_t GetIFrameQP() override;
+
+ virtual void SetPFrameQP(uint8_t v) override;
+ virtual uint8_t GetPFrameQP() override;
+
+ virtual void SetBFrameQP(uint8_t v);
+ virtual uint8_t GetBFrameQP();
+
+ virtual void SetMaximumAccessUnitSize(uint32_t v) override;
+ virtual uint32_t GetMaximumAccessUnitSize() override;
+
+ virtual std::pair<uint64_t, uint64_t> CapsVBVBufferSize() override;
+ virtual void SetVBVBufferSize(uint64_t v) override;
+ virtual uint64_t GetVBVBufferSize() override;
+
+ virtual void SetVBVBufferInitialFullness(double v) override;
+ virtual float GetInitialVBVBufferFullness() override;
+
+ // Properties - Picture Control
+ virtual void SetIDRPeriod(uint32_t v) override;
+ virtual uint32_t GetIDRPeriod() override;
+
+ void SetHeaderInsertionSpacing(uint32_t v);
+ uint32_t GetHeaderInsertionSpacing();
+
+ virtual void SetGOPAlignmentEnabled(bool v) override;
+ virtual bool IsGOPAlignmentEnabled() override;
+
+ virtual void SetDeblockingFilterEnabled(bool v) override;
+ virtual bool IsDeblockingFilterEnabled() override;
+
+ virtual uint8_t CapsBFramePattern();
+ virtual void SetBFramePattern(uint8_t v);
+ virtual uint8_t GetBFramePattern();
+
+ virtual void SetBFrameDeltaQP(int8_t v);
+ virtual int8_t GetBFrameDeltaQP();
+
+ virtual void SetBFrameReferenceEnabled(bool v);
+ virtual bool IsBFrameReferenceEnabled();
+
+ virtual void SetBFrameReferenceDeltaQP(int8_t v);
+ virtual int8_t GetBFrameReferenceDeltaQP();
+
+ // Properties - Motion Estimation
+ virtual void SetMotionEstimationQuarterPixelEnabled(bool v) override;
+ virtual bool IsMotionEstimationQuarterPixelEnabled() override;
+
+ virtual void SetMotionEstimationHalfPixelEnabled(bool v) override;
+ virtual bool IsMotionEstimationHalfPixelEnabled() override;
+
+ // Properties - Intra-Refresh
+ std::pair<uint32_t, uint32_t> CapsIntraRefreshNumMBsPerSlot();
+ void SetIntraRefreshNumMBsPerSlot(uint32_t v);
+ uint32_t GetIntraRefreshNumMBsPerSlot();
+
+ void SetIntraRefreshNumOfStripes(uint32_t v);
+ uint32_t GetIntraRefreshNumOfStripes();
+
+ // Properties - Slicing
+ void SetSliceMode(H264::SliceMode v);
+ H264::SliceMode GetSliceMode();
+
+ virtual std::pair<uint32_t, uint32_t> CapsSlicesPerFrame() override;
+ virtual void SetSlicesPerFrame(uint32_t v) override;
+ virtual uint32_t GetSlicesPerFrame() override;
+
+ virtual void SetSliceControlMode(SliceControlMode v) override;
+ virtual SliceControlMode GetSliceControlMode() override;
+
+ virtual std::pair<uint32_t, uint32_t> CapsSliceControlSize() override;
+ virtual void SetSliceControlSize(uint32_t v) override;
+ virtual uint32_t GetSliceControlSize() override;
+
+ std::pair<uint32_t, uint32_t> CapsMaximumSliceSize();
+ void SetMaximumSliceSize(uint32_t v);
+ uint32_t GetMaximumSliceSize();
+
+ // Properties - Experimental
+ virtual void SetLowLatencyInternal(bool v) override;
+ virtual bool GetLowLatencyInternal() override;
+
+ virtual void SetCommonLowLatencyInternal(bool v) override;
+ virtual bool GetCommonLowLatencyInternal() override;
+
+ // Internal
+ virtual void LogProperties() override;
+ protected:
+ virtual void PacketPriorityAndKeyframe(amf::AMFDataPtr& d, struct encoder_packet* p) override;
+ virtual AMF_RESULT GetExtraDataInternal(amf::AMFVariant* p) override;
+ virtual std::string HandleTypeOverride(amf::AMFSurfacePtr& d, uint64_t index) override;
+
+ AMF_VIDEO_ENCODER_PICTURE_TYPE_ENUM m_FrameSkipType = AMF_VIDEO_ENCODER_PICTURE_TYPE_NONE;
+ };
+ }
+}
\ No newline at end of file
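Throughout this header the `Caps*` methods report supported ranges as min/max pairs (e.g. `CapsTargetBitrate()`, `CapsVBVBufferSize()`). A plausible caller-side use, sketched here with a hypothetical helper, is to clamp a requested value into the reported range before invoking the matching setter:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <utility>

// Clamp a requested value into a (min, max) capability pair, the shape
// returned by methods like CapsTargetBitrate() in this header.
// Hypothetical helper for illustration; not part of the plugin API.
uint64_t ClampToCaps(uint64_t requested, std::pair<uint64_t, uint64_t> caps) {
    return std::clamp(requested, caps.first, caps.second);
}
```

Encoding the range as `std::pair` keeps the query a single call, at the cost of the caller remembering that `.first` is the minimum and `.second` the maximum.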
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/amf-encoder-h265.h
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#pragma once
+
+#include "plugin.h"
+#include "amf-encoder.h"
+#include "components/VideoEncoderHEVC.h"
+
+namespace Plugin {
+ namespace AMD {
+ namespace H265 {
+ enum class Tier : uint8_t {
+ Main,
+ High,
+ };
+ enum class GOPType : uint8_t {
+ Fixed, // Fixed Interval GOP
+ Variable, // Variable Interval GOP
+ };
+ enum class HeaderInsertionMode : uint8_t {
+ None = 0,
+ AlignedToGOP = 1,
+ AlignedToIDR = 2,
+ };
+ }
+
+ class EncoderH265 : public Encoder {
+ public:
+ EncoderH265(
+ std::shared_ptr<API::IAPI> videoAPI, API::Adapter videoAdapter = API::Adapter::Adapter(),
+ bool useOpenCLSubmission = false, bool useOpenCLConversion = false,
+ ColorFormat colorFormat = ColorFormat::NV12, ColorSpace colorSpace = ColorSpace::BT709, bool fullRangeColor = false,
+ bool useAsyncQueue = false, size_t asyncQueueSize = 0);
+ virtual ~EncoderH265();
+
+ // Initialization
+ virtual std::vector<Usage> CapsUsage() override;
+ virtual void SetUsage(Usage v) override;
+ virtual Usage GetUsage() override;
+
+ // Static
+ virtual std::vector<QualityPreset> CapsQualityPreset() override;
+ virtual void SetQualityPreset(QualityPreset v) override;
+ virtual QualityPreset GetQualityPreset() override;
+
+ virtual std::pair<std::pair<uint32_t, uint32_t>, std::pair<uint32_t, uint32_t>> CapsResolution() override;
+ virtual void SetResolution(std::pair<uint32_t, uint32_t> v) override;
+ virtual std::pair<uint32_t, uint32_t> GetResolution() override;
+
+ virtual void SetAspectRatio(std::pair<uint32_t, uint32_t> v) override;
+ virtual std::pair<uint32_t, uint32_t> GetAspectRatio() override;
+
+ virtual void SetFrameRate(std::pair<uint32_t, uint32_t> v) override;
+ virtual std::pair<uint32_t, uint32_t> GetFrameRate() override;
+
+ virtual std::vector<Profile> CapsProfile() override;
+ virtual void SetProfile(Profile v) override;
+ virtual Profile GetProfile() override;
+
+ virtual std::vector<ProfileLevel> CapsProfileLevel() override;
+ virtual void SetProfileLevel(ProfileLevel v) override;
+ virtual ProfileLevel GetProfileLevel() override;
+
+ std::vector<H265::Tier> CapsTier();
+ void SetTier(H265::Tier v);
+ H265::Tier GetTier();
+
+ virtual std::pair<uint64_t, uint64_t> CapsMaximumReferenceFrames() override;
+ virtual void SetMaximumReferenceFrames(uint64_t v) override;
+ virtual uint64_t GetMaximumReferenceFrames() override;
+
+ virtual std::vector<CodingType> CapsCodingType() override;
+ virtual void SetCodingType(CodingType v) override;
+ virtual CodingType GetCodingType() override;
+
+ virtual std::pair<uint32_t, uint32_t> CapsMaximumLongTermReferenceFrames() override;
+ virtual void SetMaximumLongTermReferenceFrames(uint32_t v) override;
+ virtual uint32_t GetMaximumLongTermReferenceFrames() override;
+
+ /// Rate Control
+ virtual std::vector<RateControlMethod> CapsRateControlMethod() override;
+ virtual void SetRateControlMethod(RateControlMethod v) override;
+ virtual RateControlMethod GetRateControlMethod() override;
+
+ virtual std::vector<PrePassMode> CapsPrePassMode() override;
+ virtual void SetPrePassMode(PrePassMode v) override;
+ virtual PrePassMode GetPrePassMode() override;
+
+ virtual void SetVarianceBasedAdaptiveQuantizationEnabled(bool v) override;
+ virtual bool IsVarianceBasedAdaptiveQuantizationEnabled() override;
+
+ /// VBV Buffer
+ virtual std::pair<uint64_t, uint64_t> CapsVBVBufferSize() override;
+ virtual void SetVBVBufferSize(uint64_t v) override;
+ virtual uint64_t GetVBVBufferSize() override;
+
+ virtual void SetVBVBufferInitialFullness(double v) override;
+ virtual float GetInitialVBVBufferFullness() override;
+
+ /// Picture Control
+ std::vector<H265::GOPType> CapsGOPType();
+ void SetGOPType(H265::GOPType v);
+ H265::GOPType GetGOPType();
+
+ void SetGOPSize(uint32_t v);
+ uint32_t GetGOPSize();
+
+ void SetGOPSizeMin(uint32_t v);
+ uint32_t GetGOPSizeMin();
+
+ void SetGOPSizeMax(uint32_t v);
+ uint32_t GetGOPSizeMax();
+
+ virtual void SetGOPAlignmentEnabled(bool v) override;
+ virtual bool IsGOPAlignmentEnabled() override;
+
+ virtual void SetIDRPeriod(uint32_t v) override; // Distance in GOPs
+ virtual uint32_t GetIDRPeriod() override;
+
+ void SetHeaderInsertionMode(H265::HeaderInsertionMode v);
+ H265::HeaderInsertionMode GetHeaderInsertionMode();
+
+ virtual void SetDeblockingFilterEnabled(bool v) override;
+ virtual bool IsDeblockingFilterEnabled() override;
+
+ /// Motion Estimation
+ virtual void SetMotionEstimationQuarterPixelEnabled(bool v) override;
+ virtual bool IsMotionEstimationQuarterPixelEnabled() override;
+
+ virtual void SetMotionEstimationHalfPixelEnabled(bool v) override;
+ virtual bool IsMotionEstimationHalfPixelEnabled() override;
+
+ // Dynamic
+ virtual void SetFrameSkippingEnabled(bool v) override;
+ virtual bool IsFrameSkippingEnabled() override;
+
+ virtual void SetEnforceHRDEnabled(bool v) override;
+ virtual bool IsEnforceHRDEnabled() override;
+
+ virtual void SetFillerDataEnabled(bool v) override;
+ virtual bool IsFillerDataEnabled() override;
+
+ void SetIFrameQPMinimum(uint8_t v);
+ uint8_t GetIFrameQPMinimum();
+
+ void SetIFrameQPMaximum(uint8_t v);
+ uint8_t GetIFrameQPMaximum();
+
+ void SetPFrameQPMinimum(uint8_t v);
+ uint8_t GetPFrameQPMinimum();
+
+ void SetPFrameQPMaximum(uint8_t v);
+ uint8_t GetPFrameQPMaximum();
+
+ virtual std::pair<uint64_t, uint64_t> CapsTargetBitrate() override;
+ virtual void SetTargetBitrate(uint64_t v) override;
+ virtual uint64_t GetTargetBitrate() override;
+
+ virtual std::pair<uint64_t, uint64_t> CapsPeakBitrate() override;
+ virtual void SetPeakBitrate(uint64_t v) override;
+ virtual uint64_t GetPeakBitrate() override;
+
+ virtual void SetIFrameQP(uint8_t v) override;
+ virtual uint8_t GetIFrameQP() override;
+
+ virtual void SetPFrameQP(uint8_t v) override;
+ virtual uint8_t GetPFrameQP() override;
+
+ virtual void SetMaximumAccessUnitSize(uint32_t v) override;
+ virtual uint32_t GetMaximumAccessUnitSize() override;
+
+ /// Intra-Refresh
+ void SetIntraRefreshMode(uint32_t v); // Description is identical to IntraRefreshNumMBsPerSlot?
+ uint32_t GetIntraRefreshMode(); // Does not seem to be an actual property yet.
+
+ void SetIntraRefreshFrameNum(uint32_t v);
+ uint32_t GetIntraRefreshFrameNum();
+
+ /// Slicing
+ virtual std::pair<uint32_t, uint32_t> CapsSlicesPerFrame() override;
+ virtual void SetSlicesPerFrame(uint32_t v) override;
+ virtual uint32_t GetSlicesPerFrame() override;
+
+ virtual void SetSliceControlMode(SliceControlMode v) override;
+ virtual SliceControlMode GetSliceControlMode() override;
+
+ virtual std::pair<uint32_t, uint32_t> CapsSliceControlSize() override;
+ virtual void SetSliceControlSize(uint32_t v) override;
+ virtual uint32_t GetSliceControlSize() override;
+
+ // Experimental
+ void SetQPCBOffset(uint8_t v);
+ uint8_t GetQPCBOffset();
+
+ void SetQPCROffset(uint8_t v);
+ uint8_t GetQPCROffset();
+
+ std::pair<uint32_t, uint32_t> CapsInputQueueSize();
+ void SetInputQueueSize(uint32_t v);
+ uint32_t GetInputQueueSize();
+
+ virtual void SetLowLatencyInternal(bool v) override;
+ virtual bool GetLowLatencyInternal() override;
+
+ virtual void SetCommonLowLatencyInternal(bool v) override;
+ virtual bool GetCommonLowLatencyInternal() override;
+
+ // Internal
+ virtual void LogProperties() override;
+ protected:
+ virtual void PacketPriorityAndKeyframe(amf::AMFDataPtr& d, struct encoder_packet* p) override;
+ virtual AMF_RESULT GetExtraDataInternal(amf::AMFVariant* p) override;
+ virtual std::string HandleTypeOverride(amf::AMFSurfacePtr& d, uint64_t index) override;
+
+ AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_ENUM m_FrameSkipType = AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_NONE;
+
+
+
+ //Remaining Properties
+ // PerformanceCounter (Interface, but which one?)
+ // HevcMaxNumOfTemporalLayers/HevcNumOfTemporalLayers/HevcTemporalLayerSelect - Only supports QP_I/P?
+ // BPicturesPattern (replaced by merge mode?)
+ // HevcMaxMBPerSec (PCI-E bandwidth, min/max)
+ };
+ }
+}
\ No newline at end of file
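The enums introduced in these headers (`H265::Tier`, `H265::GOPType`, `H265::HeaderInsertionMode`) are `uint8_t`-backed `enum class`es, so their values do not convert implicitly when handed to a property-style API; an explicit cast is needed. A small sketch, with the enum values copied from the header above and the conversion helper being an assumption of mine:

```cpp
#include <cassert>
#include <cstdint>

// Mirrors H265::HeaderInsertionMode from this header. enum class values
// need an explicit cast before submission as a raw integer property.
enum class HeaderInsertionMode : uint8_t {
    None = 0,
    AlignedToGOP = 1,
    AlignedToIDR = 2,
};

// Hypothetical helper: extract the underlying integer value.
uint8_t ToPropertyValue(HeaderInsertionMode m) {
    return static_cast<uint8_t>(m);
}
```

The scoped enums prevent accidentally passing an H264 `SliceMode` where an H265 `HeaderInsertionMode` is expected, which a plain C enum would silently allow.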
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/amf-encoder.h
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#pragma once
+
+#include <vector>
+#include <queue>
+#include <thread>
+#include <mutex>
+#include <condition_variable>
+#include <chrono>
+
+#include "plugin.h"
+#include "amf.h"
+#include "api-base.h"
+
+#include "components/Component.h"
+
+#define AMF_TIMESTAMP_ALLOCATE L"TS_Allocate"
+#define AMF_TIME_ALLOCATE L"T_Allocate"
+#define AMF_TIMESTAMP_STORE L"TS_Store"
+#define AMF_TIME_STORE L"T_Store"
+#define AMF_TIMESTAMP_CONVERT L"TS_Convert"
+#define AMF_TIME_CONVERT L"T_Convert"
+#define AMF_TIMESTAMP_SUBMIT L"TS_Submit"
+#define AMF_TIMESTAMP_QUERY L"TS_Query"
+#define AMF_TIME_MAIN L"T_Main" // Time between Submit and Query
+
+#define AMF_PRESENT_TIMESTAMP L"PTS"
+
+#ifdef _DEBUG
+#define AMFTRACECALL { \
+ std::mbstate_t state = std::mbstate_t(); \
+ auto trace = AMF::Instance()->GetTrace(); \
+ const char* file = __FILE__; \
+ const char* fname = __FUNCTION_NAME__; \
+ std::vector<wchar_t> buf(std::mbsrtowcs(NULL, &file, 0, &state) + 1); \
+ std::mbsrtowcs(buf.data(), &file, buf.size(), &state); \
+ std::vector<wchar_t> buf2(std::mbsrtowcs(NULL, &fname, 0, &state) + 1); \
+ std::mbsrtowcs(buf2.data(), &fname, buf2.size(), &state); \
+ trace->TraceW(buf.data(), __LINE__, AMF_TRACE_DEBUG, L"Trace", 1, L"Function: %s", buf2.data()); \
+ PLOG_DEBUG("<Trace> " __FUNCTION_NAME__); \
+};
+#else
+#define AMFTRACECALL ;
+#endif
+
+namespace Plugin {
+ namespace AMD {
+ // Initialization Parameters
+ enum class Codec : uint8_t {
+ AVC,
+ SVC,
+ HEVC,
+ };
+ enum class ColorFormat : uint8_t {
+ /* Support Table
+ * Open Broadcaster AMD AMF
+ * -- 4:2:0 Formats --
+ * VIDEO_FORMAT_I420 AMF_SURFACE_YUV420P
+ * VIDEO_FORMAT_NV12 AMF_SURFACE_NV12
+ * AMF_SURFACE_YV12
+ * -- 4:2:2 Formats --
+ * VIDEO_FORMAT_YVYV
+ * VIDEO_FORMAT_YUY2 AMF_SURFACE_YUY2
+ * VIDEO_FORMAT_UYVY
+ *
+ * -- 4:4:4 Formats --
+ * VIDEO_FORMAT_I444
+ *
+ * -- Packed Uncompressed Formats --
+ * VIDEO_FORMAT_RGBA AMF_SURFACE_RGBA
+ * VIDEO_FORMAT_BGRA AMF_SURFACE_BGRA
+ * AMF_SURFACE_ARGB
+ * VIDEO_FORMAT_BGRX
+ *
+ * -- Single/Dual Channel Formats --
+ * VIDEO_FORMAT_Y800 AMF_SURFACE_GRAY8
+ * AMF_SURFACE_U8V8
+ *
+ * -- HDR Color Formats --
+ * AMF_SURFACE_P010
+ * AMF_SURFACE_RGBA_F16
+ */
+
+ I420,
+ NV12,
+ YUY2,
+ BGRA,
+ RGBA,
+ GRAY,
+ };
+ enum class ColorSpace : uint8_t {
+ BT601,
+ BT709,
+ BT2020,
+ };
+
+ // Properties
+ enum class Usage : uint8_t {
+ Transcoding,
+ UltraLowLatency,
+ LowLatency,
+ Webcam
+ };
+ enum class QualityPreset : uint8_t {
+ Speed,
+ Balanced,
+ Quality,
+ };
+ enum class Profile : uint16_t {
+ ConstrainedBaseline = 256,
+ Baseline = 66,
+ Main = 77,
+ ConstrainedHigh = 257,
+ High = 100,
+ };
+ enum class ProfileLevel : uint8_t {
+ Automatic,
+ L10 = 10,
+ L11,
+ L12,
+ L13,
+ L20 = 20,
+ L21,
+ L22,
+ L30 = 30,
+ L31,
+ L32,
+ L40 = 40,
+ L41,
+ L42,
+ L50 = 50,
+ L51,
+ L52,
+ L60 = 60,
+ L61,
+ L62,
+ };
+ enum class CodingType : uint8_t {
+ Automatic,
+ CALVC,
+ CABAC,
+ };
+ enum class RateControlMethod : uint8_t {
+ ConstantQP,
+ LatencyConstrainedVariableBitrate,
+ PeakConstrainedVariableBitrate,
+ ConstantBitrate,
+ };
+ enum class PrePassMode : uint8_t {
+ Disabled,
+ Enabled,
+ EnabledAtHalfScale,
+ EnabledAtQuarterScale,
+ };
+ enum class SliceControlMode : uint8_t {
+ Unknown0,
+ Unknown1,
+ Unknown2,
+ Unknown3,
+ };
+
+ class Encoder {
+ protected:
+ Encoder(Codec codec,
+ std::shared_ptr<API::IAPI> videoAPI, API::Adapter videoAdapter,
+ bool useOpenCLSubmission, bool useOpenCLConversion,
+ ColorFormat colorFormat, ColorSpace colorSpace, bool fullRangeColor,
+ bool useAsyncQueue, size_t asyncQueueSize);
+ public:
+ virtual ~Encoder();
+
+ public:
+
+ #pragma region Initialization
+ uint64_t GetUniqueId();
+
+ //void SetCodec(Codec v);
+ Codec GetCodec();
+
+ //void SetVideoAPI(std::shared_ptr<API::IAPI> v);
+ std::shared_ptr<API::IAPI> GetVideoAPI();
+
+ //void SetVideoAdapter(API::Adapter v);
+ API::Adapter GetVideoAdapter();
+
+ //void SetOpenCLEnabled(bool v);
+ bool IsOpenCLEnabled();
+
+ //void SetColorFormat(ColorFormat v);
+ ColorFormat GetColorFormat();
+
+ //void SetColorSpace(ColorSpace v);
+ ColorSpace GetColorSpace();
+
+ //void SetFullRangeColor(bool v);
+ bool IsFullRangeColor();
+
+ //void SetAsynchronousQueueEnabled(bool v);
+ bool IsAsynchronousQueueEnabled();
+
+ //void SetAsynchronousQueueSize(size_t v);
+ size_t GetAsynchronousQueueSize();
+
+ //bool Initialize();
+ #pragma endregion Initialization
+
+ #pragma region Settings
+ virtual std::vector<Usage> CapsUsage() = 0;
+ virtual void SetUsage(Usage v) = 0;
+ virtual Usage GetUsage() = 0;
+
+ virtual std::vector<QualityPreset> CapsQualityPreset() = 0;
+ virtual void SetQualityPreset(QualityPreset v) = 0;
+ virtual QualityPreset GetQualityPreset() = 0;
+
+ #pragma region Frame
+ virtual std::pair<std::pair<uint32_t, uint32_t>, std::pair<uint32_t, uint32_t>> CapsResolution() = 0;
+ virtual void SetResolution(std::pair<uint32_t, uint32_t> v) = 0;
+ virtual std::pair<uint32_t, uint32_t> GetResolution() = 0;
+
+ virtual void SetAspectRatio(std::pair<uint32_t, uint32_t> v) = 0;
+ virtual std::pair<uint32_t, uint32_t> GetAspectRatio() = 0;
+
+ virtual void SetFrameRate(std::pair<uint32_t, uint32_t> v) = 0;
+ virtual std::pair<uint32_t, uint32_t> GetFrameRate() = 0;
+ #pragma endregion Frame
+
+ #pragma region Profile
+ virtual std::vector<Profile> CapsProfile() = 0;
+ virtual void SetProfile(Profile v) = 0;
+ virtual Profile GetProfile() = 0;
+
+ virtual std::vector<ProfileLevel> CapsProfileLevel() = 0;
+ virtual void SetProfileLevel(ProfileLevel v) = 0;
+ virtual ProfileLevel GetProfileLevel() = 0;
+ #pragma endregion Profile
+
+ virtual std::vector<CodingType> CapsCodingType() = 0;
+ virtual void SetCodingType(CodingType v) = 0;
+ virtual CodingType GetCodingType() = 0;
+
+ #pragma region Reference Frames
+ virtual std::pair<uint64_t, uint64_t> CapsMaximumReferenceFrames() = 0;
+ virtual void SetMaximumReferenceFrames(uint64_t v) = 0;
+ virtual uint64_t GetMaximumReferenceFrames() = 0;
+
+ virtual std::pair<uint32_t, uint32_t> CapsMaximumLongTermReferenceFrames() = 0;
+ virtual void SetMaximumLongTermReferenceFrames(uint32_t v) = 0;
+ virtual uint32_t GetMaximumLongTermReferenceFrames() = 0;
+ #pragma endregion Reference Frames
+
+ virtual std::vector<RateControlMethod> CapsRateControlMethod() = 0;
+ virtual void SetRateControlMethod(RateControlMethod v) = 0;
+ virtual RateControlMethod GetRateControlMethod() = 0;
+
+ virtual std::vector<PrePassMode> CapsPrePassMode() = 0;
+ virtual void SetPrePassMode(PrePassMode v) = 0;
+ virtual PrePassMode GetPrePassMode() = 0;
+
+ virtual void SetVarianceBasedAdaptiveQuantizationEnabled(bool v) = 0;
+ virtual bool IsVarianceBasedAdaptiveQuantizationEnabled() = 0;
+
+ virtual void SetFrameSkippingEnabled(bool v) = 0;
+ virtual bool IsFrameSkippingEnabled() = 0;
+
+ virtual void SetFrameSkippingPeriod(uint32_t v);
+ virtual uint32_t GetFrameSkippingPeriod();
+
+ virtual void SetFrameSkippingBehaviour(bool v);
+ virtual bool GetFrameSkippingBehaviour();
+
+ /// Enforce Hypothetical Reference Decoder Restrictions
+ virtual void SetEnforceHRDEnabled(bool v) = 0;
+ virtual bool IsEnforceHRDEnabled() = 0;
+
+ virtual void SetFillerDataEnabled(bool v) = 0;
+ virtual bool IsFillerDataEnabled() = 0;
+
+ virtual std::pair<uint64_t, uint64_t> CapsTargetBitrate() = 0;
+ virtual void SetTargetBitrate(uint64_t v) = 0;
+ virtual uint64_t GetTargetBitrate() = 0;
+
+ virtual std::pair<uint64_t, uint64_t> CapsPeakBitrate() = 0;
+ virtual void SetPeakBitrate(uint64_t v) = 0;
+ virtual uint64_t GetPeakBitrate() = 0;
+
+ virtual void SetIFrameQP(uint8_t v) = 0;
+ virtual uint8_t GetIFrameQP() = 0;
+
+ virtual void SetPFrameQP(uint8_t v) = 0;
+ virtual uint8_t GetPFrameQP() = 0;
+
+ virtual void SetMaximumAccessUnitSize(uint32_t v) = 0;
+ virtual uint32_t GetMaximumAccessUnitSize() = 0;
+
+ #pragma region Video Buffering Verifier
+ virtual std::pair<uint64_t, uint64_t> CapsVBVBufferSize() = 0;
+ virtual void SetVBVBufferSize(uint64_t v) = 0;
+ void SetVBVBufferStrictness(double_t v);
+ virtual uint64_t GetVBVBufferSize() = 0;
+
+ virtual void SetVBVBufferInitialFullness(double v) = 0;
+ virtual float GetInitialVBVBufferFullness() = 0;
+ #pragma endregion Video Buffering Verifier
+
+ #pragma region Picture Control
+ virtual void SetIDRPeriod(uint32_t v) = 0;
+ virtual uint32_t GetIDRPeriod() = 0;
+
+ virtual void SetIFramePeriod(uint32_t v);
+ virtual uint32_t GetIFramePeriod();
+
+ virtual void SetPFramePeriod(uint32_t v);
+ virtual uint32_t GetPFramePeriod();
+
+ virtual void SetBFramePeriod(uint32_t v);
+ virtual uint32_t GetBFramePeriod();
+
+ virtual void SetGOPAlignmentEnabled(bool v) = 0;
+ virtual bool IsGOPAlignmentEnabled() = 0;
+
+ virtual void SetDeblockingFilterEnabled(bool v) = 0;
+ virtual bool IsDeblockingFilterEnabled() = 0;
+ #pragma endregion Picture Control
+
+ #pragma region Motion Estimation
+ virtual void SetMotionEstimationQuarterPixelEnabled(bool v) = 0;
+ virtual bool IsMotionEstimationQuarterPixelEnabled() = 0;
+
+ virtual void SetMotionEstimationHalfPixelEnabled(bool v) = 0;
+ virtual bool IsMotionEstimationHalfPixelEnabled() = 0;
+ #pragma endregion Motion Estimation
+
+ #pragma region Slicing
+ virtual std::pair<uint32_t, uint32_t> CapsSlicesPerFrame() = 0;
+ virtual void SetSlicesPerFrame(uint32_t v) = 0;
+ virtual uint32_t GetSlicesPerFrame() = 0;
+
+ virtual void SetSliceControlMode(SliceControlMode v) = 0; // 0-1 range, Horz/Vert perhaps?
+ virtual SliceControlMode GetSliceControlMode() = 0;
+
+ virtual std::pair<uint32_t, uint32_t> CapsSliceControlSize() = 0;
+ virtual void SetSliceControlSize(uint32_t v) = 0;
+ virtual uint32_t GetSliceControlSize() = 0;
+ #pragma endregion Slicing
+
+ #pragma region Internal
+ virtual void SetLowLatencyInternal(bool v) = 0;
+ virtual bool GetLowLatencyInternal() = 0;
+
+ virtual void SetCommonLowLatencyInternal(bool v) = 0;
+ virtual bool GetCommonLowLatencyInternal() = 0;
+ #pragma endregion Internal
+ #pragma endregion Settings
+
+ #pragma region Control
+ void Start();
+ void Restart();
+ void Stop();
+
+ bool IsStarted();
+ virtual void LogProperties() = 0;
+
+ bool Encode(struct encoder_frame* f, struct encoder_packet* p, bool* b);
+ void GetVideoInfo(struct video_scale_info* info);
+ bool GetExtraData(uint8_t** extra_data, size_t* size);
+ #pragma endregion Control
+
+ protected:
+ void UpdateFrameRateValues();
+
+ private:
+ virtual void PacketPriorityAndKeyframe(amf::AMFDataPtr& d, struct encoder_packet* p) = 0;
+ virtual AMF_RESULT GetExtraDataInternal(amf::AMFVariant* p) = 0;
+ virtual std::string HandleTypeOverride(amf::AMFSurfacePtr& d, uint64_t index) = 0;
+
+ bool EncodeAllocate(OUT amf::AMFSurfacePtr& surface);
+ bool EncodeStore(OUT amf::AMFSurfacePtr& surface, IN struct encoder_frame* frame);
+ bool EncodeConvert(IN amf::AMFSurfacePtr& surface, OUT amf::AMFDataPtr& data);
+ bool EncodeMain(IN amf::AMFDataPtr& data, OUT amf::AMFDataPtr& packet);
+ bool EncodeLoad(IN amf::AMFDataPtr& data, OUT struct encoder_packet* packet, OUT bool* received_packet);
+
+ static int32_t AsyncSendMain(Encoder* obj);
+ int32_t AsyncSendLocalMain();
+ static int32_t AsyncRetrieveMain(Encoder* obj);
+ int32_t AsyncRetrieveLocalMain();
+
+ protected:
+ // AMF Internals
+ Plugin::AMD::AMF* m_AMF;
+ amf::AMFFactory* m_AMFFactory;
+ amf::AMFContextPtr m_AMFContext;
+ amf::AMFComputePtr m_AMFCompute;
+ amf::AMFComponentPtr m_AMFEncoder;
+ amf::AMFComponentPtr m_AMFConverter;
+ amf::AMF_MEMORY_TYPE m_AMFMemoryType;
+ amf::AMF_SURFACE_FORMAT m_AMFSurfaceFormat;
+
+ // API Related
+ std::shared_ptr<API::IAPI> m_API;
+ API::Adapter m_APIAdapter;
+ std::shared_ptr<API::Instance> m_APIDevice;
+
+ // Buffers
+ std::vector<uint8_t> m_PacketDataBuffer;
+ std::vector<uint8_t> m_ExtraDataBuffer;
+
+ // Flags
+ bool m_Initialized;
+ bool m_Started;
+ bool m_OpenCL;
+ bool m_OpenCLSubmission; // Submit Frames using OpenCL
+ bool m_OpenCLConversion; // Convert Frames using OpenCL instead of DirectCompute
+ bool m_HaveFirstFrame;
+
+ // Properties
+ uint64_t m_UniqueId;
+ Codec m_Codec;
+ ColorFormat m_ColorFormat;
+ ColorSpace m_ColorSpace;
+ bool m_FullColorRange;
+ std::pair<uint32_t, uint32_t> m_Resolution;
+ std::pair<uint32_t, uint32_t> m_FrameRate;
+ double_t m_FrameRateFraction;
+ double_t m_TimestampStep;
+ uint64_t m_TimestampStepRounded;
+ uint64_t m_TimestampOffset = 0;
+ std::chrono::nanoseconds m_SubmitQueryWaitTimer;
+ uint64_t m_SubmitQueryAttempts = 8;
+ uint32_t m_PeriodIDR = 1;
+ uint32_t m_PeriodIFrame = 0;
+ uint32_t m_PeriodPFrame = 0;
+ uint32_t m_PeriodBFrame = 0;
+ uint32_t m_FrameSkipPeriod = 0;
+ bool m_FrameSkipKeepOnlyNth = false; // false = drop every xth frame, true = drop all but every xth frame
+
+ // Threading
+ bool m_AsyncQueue;
+ size_t m_AsyncQueueSize;
+ struct EncoderThreadingData {
+ // Thread
+ std::thread worker;
+ bool shutdown;
+ // Semaphore
+ size_t wakeupcount;
+ std::condition_variable condvar;
+ std::mutex mutex;
+ // Data
+ std::queue<amf::AMFDataPtr> queue;
+ } *m_AsyncSend, *m_AsyncRetrieve;
+ };
+ }
+}
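The `EncoderThreadingData` struct above bundles a worker thread, a mutex, a condition variable, a wake-up counter, and a data queue — a classic producer/consumer handoff for the async send/retrieve paths. A minimal stand-alone sketch of that pattern (names and types here are illustrative, not the plugin's actual code):

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <queue>

// Minimal producer/consumer queue in the style of EncoderThreadingData:
// producers push work and signal the condvar; the consumer waits until
// either work is available or shutdown has been requested.
struct WorkQueue {
    std::queue<int> queue;
    std::mutex mutex;
    std::condition_variable condvar;
    size_t wakeupcount = 0;
    bool shutdown = false;

    void push(int v) {
        {
            std::lock_guard<std::mutex> lock(mutex);
            queue.push(v);
            ++wakeupcount;
        }
        condvar.notify_one();
    }

    // Returns false once shutdown is requested and the queue is drained.
    bool pop(int& out) {
        std::unique_lock<std::mutex> lock(mutex);
        condvar.wait(lock, [this] { return wakeupcount > 0 || shutdown; });
        if (queue.empty())
            return false;
        out = queue.front();
        queue.pop();
        --wakeupcount;
        return true;
    }

    void stop() {
        {
            std::lock_guard<std::mutex> lock(mutex);
            shutdown = true;
        }
        condvar.notify_all();
    }
};

// Drains the queue, summing values, until stop() ends the loop.
int consume_sum(WorkQueue& q) {
    int sum = 0, v = 0;
    while (q.pop(v))
        sum += v;
    return sum;
}
```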
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/amf.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/amf.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
class AMF {
#pragma region Singleton
public:
- static std::shared_ptr<Plugin::AMD::AMF> GetInstance();
- #pragma endregion Singleton
+ static void Initialize();
+ static AMF* Instance();
+ static void Finalize();
- public:
+ private: // Private Initializer & Finalizer
AMF();
~AMF();
+ public: // Remove all Copy operators
+ AMF(AMF const&) = delete;
+ void operator=(AMF const&) = delete;
+ #pragma endregion Singleton
+
+ public:
amf::AMFFactory* GetFactory();
amf::AMFTrace* GetTrace();
amf::AMFDebug* GetDebug();
/// AMF Values
HMODULE m_AMFModule;
- uint64_t m_AMFVersion_Compiler;
+ uint64_t m_AMFVersion_Plugin;
uint64_t m_AMFVersion_Runtime;
/// AMF Functions
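The `amf.h` hunk above replaces a lazily constructed `GetInstance()` returning a `shared_ptr` with an explicit `Initialize()` / `Instance()` / `Finalize()` lifetime, and deletes the copy constructor and assignment operator. A compact sketch of that explicit-lifetime singleton shape (class name and `Version()` member are hypothetical placeholders):

```cpp
#include <cassert>
#include <cstddef>

// Explicit-lifetime singleton in the style of the new Plugin::AMD::AMF:
// construction and destruction happen only through Initialize()/Finalize(),
// and copying is deleted so the single instance cannot be duplicated.
class Runtime {
public:
    static void Initialize() {
        if (!instance_)
            instance_ = new Runtime();
    }
    static Runtime* Instance() { return instance_; }
    static void Finalize() {
        delete instance_;
        instance_ = nullptr;
    }

    // Remove all copy operators, matching the pattern in the diff.
    Runtime(Runtime const&) = delete;
    void operator=(Runtime const&) = delete;

    int Version() const { return 42; }  // placeholder payload

private:
    Runtime() = default;
    ~Runtime() = default;
    static Runtime* instance_;
};

Runtime* Runtime::instance_ = nullptr;
```

The practical difference from the old `GetInstance()` style is that module load/unload code decides exactly when the AMF runtime is brought up and torn down, rather than tying it to the last `shared_ptr` release.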
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/api-base.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/api-base.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "plugin.h"
-
#include <vector>
#include <map>
+#include <string.h>
+#include <memory>
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
namespace Plugin {
namespace API {
- enum class Type {
+ /**
+ *
+ */
+ enum class Type : uint8_t {
Host,
Direct3D9,
Direct3D11,
OpenGL,
};
+ // An Adapter on an API
struct Adapter {
int32_t idLow, idHigh;
std::string Name;
- Adapter() : idLow(0), idHigh(0), Name("Invalid Device") {}
- Adapter(int32_t idLow, int32_t idHigh, std::string Name) : idLow(idLow), idHigh(idHigh), Name(Name) {}
+ Adapter()
+ : idLow(0), idHigh(0), Name("Invalid Device") {}
+ Adapter(int32_t p_idLow, int32_t p_idHigh, std::string p_Name)
+ : idLow(p_idLow), idHigh(p_idHigh), Name(p_Name) {}
+ Adapter(Adapter const& o) {
+ idLow = o.idLow;
+ idHigh = o.idHigh;
+ Name = o.Name;
+ }
+ void operator=(Adapter const& o) {
+ idLow = o.idLow;
+ idHigh = o.idHigh;
+ Name = o.Name;
+ }
friend bool operator<(const Plugin::API::Adapter& left, const Plugin::API::Adapter& right);
friend bool operator>(const Plugin::API::Adapter& left, const Plugin::API::Adapter& right);
friend bool operator!=(const Plugin::API::Adapter& left, const Plugin::API::Adapter& right);
};
- class Base {
- //////////////////////////////////////////////////////////////////////////
- // API Index
- //////////////////////////////////////////////////////////////////////////
+ // Instance of an API Adapter
+ struct Instance {
public:
- static void Initialize();
-
- static size_t GetAPICount();
- static std::shared_ptr<Base> GetAPIInstance(size_t index);
- static std::string GetAPIName(size_t index);
- static std::shared_ptr<Base> GetAPIByName(std::string name);
- static std::vector<std::shared_ptr<Base>> EnumerateAPIs();
- static std::vector<std::string> EnumerateAPINames();
-
- //////////////////////////////////////////////////////////////////////////
- // API
- //////////////////////////////////////////////////////////////////////////
+ Instance();
+ virtual ~Instance();
+
+ virtual Adapter GetAdapter() = 0;
+ virtual void* GetContext() = 0;
+ };
+
+ // API Interface
+ class IAPI {
public:
+ IAPI();
+ virtual ~IAPI();
+
virtual std::string GetName() = 0;
virtual Type GetType() = 0;
virtual std::vector<Adapter> EnumerateAdapters() = 0;
- virtual Adapter GetAdapterById(uint32_t idLow, uint32_t idHigh) = 0;
- virtual Adapter GetAdapterByName(std::string name) = 0;
+ Adapter GetAdapterById(int32_t idLow, int32_t idHigh);
+ Adapter GetAdapterByName(std::string name);
- virtual void* CreateInstanceOnAdapter(Adapter adapter) = 0;
- virtual Adapter GetAdapterForInstance(void* instance) = 0;
- virtual void* GetContextFromInstance(void* instance) = 0;
- virtual void DestroyInstance(void* instance) = 0;
+ virtual std::shared_ptr<Instance> CreateInstance(Adapter adapter) = 0;
};
+
+ // Static API Stuff
+ void InitializeAPIs();
+ void FinalizeAPIs();
+ size_t CountAPIs();
+ std::string GetAPIName(size_t index);
+ std::shared_ptr<IAPI> GetAPI(size_t index);
+ std::shared_ptr<IAPI> GetAPI(std::string name);
+ std::shared_ptr<IAPI> GetAPI(Type type);
+ std::vector<std::shared_ptr<IAPI>> EnumerateAPIs();
+ std::vector<std::string> EnumerateAPINames();
}
}
\ No newline at end of file
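The `api-base.h` change above is an RAII cleanup: the old `void* CreateInstanceOnAdapter()` / `DestroyInstance(void*)` pair becomes a single `CreateInstance()` returning `std::shared_ptr<Instance>`, so the device handle is released automatically. A simplified sketch of the idea (the `DummyInstance` type is a stand-in, not one of the plugin's real backends):

```cpp
#include <cassert>
#include <memory>

// Abstract instance handle, analogous to Plugin::API::Instance.
struct Instance {
    virtual ~Instance() = default;
    virtual void* GetContext() = 0;
};

// Toy backend that flips a flag on construction/destruction so the
// automatic-release behavior is observable.
struct DummyInstance : Instance {
    explicit DummyInstance(bool* alive) : alive_(alive) { *alive_ = true; }
    ~DummyInstance() override { *alive_ = false; }
    void* GetContext() override { return this; }
    bool* alive_;
};

// Factory in the style of IAPI::CreateInstance: the caller never sees a
// raw pointer it must remember to destroy.
std::shared_ptr<Instance> CreateInstance(bool* alive) {
    return std::make_shared<DummyInstance>(alive);
}
```

With the old `void*` interface, forgetting `DestroyInstance()` leaked a D3D/OpenGL device; with `shared_ptr`, the destructor runs when the last owner goes out of scope.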
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/api-d3d11.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/api-d3d11.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-base.h"
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
+#include <vector>
+#include <map>
+#include <mutex>
+#include <dxgi.h>
+#include <d3d11.h>
+#include <atlutil.h>
namespace Plugin {
namespace API {
- class Direct3D11 : public Base {
+ class Direct3D11 : public IAPI {
+ friend class Direct3D11Instance;
+ public:
+
+ Direct3D11();
+ ~Direct3D11();
+
virtual std::string GetName() override;
virtual Type GetType() override;
-
virtual std::vector<Adapter> EnumerateAdapters() override;
- virtual Adapter GetAdapterById(uint32_t idLow, uint32_t idHigh) override;
- virtual Adapter GetAdapterByName(std::string name) override;
+ virtual std::shared_ptr<Instance> CreateInstance(Adapter adapter) override;
+
+ protected:
+ ATL::CComPtr<IDXGIFactory1> m_DXGIFactory;
+ //std::mutex m_InstanceMapMutex;
+ //std::map<std::pair<int32_t, int32_t>, std::shared_ptr<Instance>> m_InstanceMap;
+
+ private:
+ std::vector<Adapter> m_AdapterList;
+ };
+
+ class Direct3D11Instance : public Instance {
+ public:
+ Direct3D11Instance(Direct3D11* api, Adapter adapter);
+ ~Direct3D11Instance();
+
+ virtual Adapter GetAdapter() override;
+ virtual void* GetContext() override;
- virtual void* CreateInstanceOnAdapter(Adapter adapter) override;
- virtual Adapter GetAdapterForInstance(void* pInstance) override;
- virtual void* GetContextFromInstance(void* pInstance) override;
- virtual void DestroyInstance(void* pInstance) override;
+ private:
+ Direct3D11* m_API;
+ Adapter m_Adapter;
+ ID3D11DeviceContext* m_DeviceContext;
+ ID3D11Device* m_Device;
};
}
}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/api-d3d9.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/api-d3d9.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-base.h"
+#include <d3d9.h>
+#include <atlutil.h>
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
+#ifdef _DEBUG
+#define D3D_DEBUG_INFO
+#endif
+#pragma comment(lib, "d3d9.lib")
namespace Plugin {
namespace API {
- class Direct3D9 : public Base {
+ class Direct3D9 : public IAPI {
+ friend class Direct3D9Instance;
+ public:
+
+ Direct3D9();
+ ~Direct3D9();
+
virtual std::string GetName() override;
virtual Type GetType() override;
-
virtual std::vector<Adapter> EnumerateAdapters() override;
- virtual Adapter GetAdapterById(uint32_t idLow, uint32_t idHigh);
- virtual Adapter GetAdapterByName(std::string name);
+ virtual std::shared_ptr<Instance> CreateInstance(Adapter adapter) override;
+
+ protected:
+ IDirect3D9Ex* m_Direct3D9Ex;
+ //std::map<std::pair<int32_t, int32_t>, std::shared_ptr<Instance>> m_InstanceMap;
+
+ private:
+ std::vector<Adapter> m_Adapters;
+ };
+
+ class Direct3D9Instance : public Instance {
+ public:
+ Direct3D9Instance(Direct3D9* api, Adapter adapter);
+ ~Direct3D9Instance();
+
+ virtual Adapter GetAdapter() override;
+ virtual void* GetContext() override;
- virtual void* CreateInstanceOnAdapter(Adapter adapter) override;
- virtual Adapter GetAdapterForInstance(void* pInstance) override;
- virtual void* GetContextFromInstance(void* pInstance) override;
- virtual void DestroyInstance(void* pInstance) override;
+ private:
+ Direct3D9* m_API;
+ Adapter m_Adapter;
+ IDirect3DDevice9Ex* m_Device;
};
}
}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/api-host.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/api-host.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-base.h"
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
-
namespace Plugin {
namespace API {
- class Host : public Base {
+ class Host : public IAPI {
+ public:
virtual std::string GetName() override;
virtual Type GetType() override;
-
virtual std::vector<Adapter> EnumerateAdapters() override;
- virtual Adapter GetAdapterById(uint32_t idLow, uint32_t idHigh);
- virtual Adapter GetAdapterByName(std::string name);
+ virtual std::shared_ptr<Instance> CreateInstance(Adapter adapter) override;
+ };
- virtual void* CreateInstanceOnAdapter(Adapter adapter) override;
- virtual Adapter GetAdapterForInstance(void* pInstance) override;
- virtual void* GetContextFromInstance(void* pInstance) override;
- virtual void DestroyInstance(void* pInstance) override;
+ class HostInstance : public Instance {
+ public:
+ virtual Adapter GetAdapter() override;
+ virtual void* GetContext() override;
};
}
}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/api-opengl.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/api-opengl.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-base.h"
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
+#ifdef _WIN32
+#include <windows.h>
+#endif
+#include <gl/GL.h>
namespace Plugin {
namespace API {
- class OpenGL : public Base {
+ class OpenGL : public IAPI {
+ public:
+ OpenGL();
+ ~OpenGL();
+
virtual std::string GetName() override;
virtual Type GetType() override;
-
virtual std::vector<Adapter> EnumerateAdapters() override;
- virtual Adapter GetAdapterById(uint32_t idLow, uint32_t idHigh);
- virtual Adapter GetAdapterByName(std::string name);
+ virtual std::shared_ptr<Instance> CreateInstance(Adapter adapter) override;
+ };
+
+ class OpenGLInstance : public Instance {
+ public:
+ OpenGLInstance();
+ ~OpenGLInstance();
+
+ virtual Adapter GetAdapter() override;
+ virtual void* GetContext() override;
- virtual void* CreateInstanceOnAdapter(Adapter adapter) override;
- virtual Adapter GetAdapterForInstance(void* instance) override;
- virtual void* GetContextFromInstance(void* instance) override;
- virtual void DestroyInstance(void* instance) override;
+ private:
+ Adapter adapter;
};
}
}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/enc-h264.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/enc-h264.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
-// Plugin
+#include "amf-encoder-h264.h"
#include "plugin.h"
-#include "amf-capabilities.h"
-#include "amf-h264.h"
-//////////////////////////////////////////////////////////////////////////
-// Defines - Translation Strings
-//////////////////////////////////////////////////////////////////////////
-
-// Presets
-#define AMF_H264_PRESET TEXT_AMF_H264("Preset")
-#define AMF_H264_PRESET_RESETTODEFAULTS TEXT_AMF_H264("Preset.ResetToDefaults")
-#define AMF_H264_PRESET_RECORDING TEXT_AMF_H264("Preset.Recording")
-#define AMF_H264_PRESET_HIGHQUALITY TEXT_AMF_H264("Preset.HighQuality")
-#define AMF_H264_PRESET_INDISTINGUISHABLE TEXT_AMF_H264("Preset.Indistinguishable")
-#define AMF_H264_PRESET_LOSSLESS TEXT_AMF_H264("Preset.Lossless")
-#define AMF_H264_PRESET_TWITCH TEXT_AMF_H264("Preset.Twitch")
-#define AMF_H264_PRESET_YOUTUBE TEXT_AMF_H264("Preset.YouTube")
-
-// Startup Properties
-#define AMF_H264_USAGE TEXT_AMF_H264("Usage")
-#define AMF_H264_USAGE_DESCRIPTION TEXT_AMF_H264("Usage.Description")
-#define AMF_H264_USAGE_TRANSCODING TEXT_AMF_H264("Usage.Transcoding")
-#define AMF_H264_USAGE_ULTRALOWLATENCY TEXT_AMF_H264("Usage.UltraLowLatency")
-#define AMF_H264_USAGE_LOWLATENCY TEXT_AMF_H264("Usage.LowLatency")
-#define AMF_H264_USAGE_WEBCAM TEXT_AMF_H264("Usage.Webcam")
-#define AMF_H264_QUALITY_PRESET TEXT_AMF_H264("QualityPreset")
-#define AMF_H264_QUALITY_PRESET_DESCRIPTION TEXT_AMF_H264("QualityPreset.Description")
-#define AMF_H264_QUALITY_PRESET_SPEED TEXT_AMF_H264("QualityPreset.Speed")
-#define AMF_H264_QUALITY_PRESET_BALANCED TEXT_AMF_H264("QualityPreset.Balanced")
-#define AMF_H264_QUALITY_PRESET_QUALITY TEXT_AMF_H264("QualityPreset.Quality")
-#define AMF_H264_PROFILE TEXT_AMF_H264("Profile")
-#define AMF_H264_PROFILE_DESCRIPTION TEXT_AMF_H264("Profile.Description")
-#define AMF_H264_PROFILELEVEL TEXT_AMF_H264("ProfileLevel")
-#define AMF_H264_PROFILELEVEL_DESCRIPTION TEXT_AMF_H264("ProfileLevel.Description")
-
-// Rate Control Properties
-#define AMF_H264_RATECONTROLMETHOD TEXT_AMF_H264("RateControlMethod")
-#define AMF_H264_RATECONTROLMETHOD_DESCRIPTION TEXT_AMF_H264("RateControlMethod.Description")
-#define AMF_H264_RATECONTROLMETHOD_CQP TEXT_AMF_H264("RateControlMethod.CQP")
-#define AMF_H264_RATECONTROLMETHOD_CBR TEXT_AMF_H264("RateControlMethod.CBR")
-#define AMF_H264_RATECONTROLMETHOD_VBR TEXT_AMF_H264("RateControlMethod.VBR.Peak")
-#define AMF_H264_RATECONTROLMETHOD_VBR_LAT TEXT_AMF_H264("RateControlMethod.VBR.Latency")
-#define AMF_H264_BITRATE_TARGET TEXT_AMF_H264("Bitrate.Target")
-#define AMF_H264_BITRATE_TARGET_DESCRIPTION TEXT_AMF_H264("Bitrate.Target.Description")
-#define AMF_H264_BITRATE_PEAK TEXT_AMF_H264("Bitrate.Peak")
-#define AMF_H264_BITRATE_PEAK_DESCRIPTION TEXT_AMF_H264("Bitrate.Peak.Description")
-#define AMF_H264_QP_MINIMUM TEXT_AMF_H264("QP.Minimum")
-#define AMF_H264_QP_MINIMUM_DESCRIPTION TEXT_AMF_H264("QP.Minimum.Description")
-#define AMF_H264_QP_MAXIMUM TEXT_AMF_H264("QP.Maximum")
-#define AMF_H264_QP_MAXIMUM_DESCRIPTION TEXT_AMF_H264("QP.Maximum.Description")
-#define AMF_H264_QP_IFRAME TEXT_AMF_H264("QP.IFrame")
-#define AMF_H264_QP_IFRAME_DESCRIPTION TEXT_AMF_H264("QP.IFrame.Description")
-#define AMF_H264_QP_PFRAME TEXT_AMF_H264("QP.PFrame")
-#define AMF_H264_QP_PFRAME_DESCRIPTION TEXT_AMF_H264("QP.PFrame.Description")
-#define AMF_H264_QP_BFRAME TEXT_AMF_H264("QP.BFrame")
-#define AMF_H264_QP_BFRAME_DESCRIPTION TEXT_AMF_H264("QP.BFrame.Description")
-#define AMF_H264_VBVBUFFER TEXT_AMF_H264("VBVBuffer")
-#define AMF_H264_VBVBUFFER_DESCRIPTION TEXT_AMF_H264("VBVBuffer.Description")
-#define AMF_H264_VBVBUFFER_STRICTNESS TEXT_AMF_H264("VBVBuffer.Strictness")
-#define AMF_H264_VBVBUFFER_STRICTNESS_DESCRIPTION TEXT_AMF_H264("VBVBuffer.Strictness.Description")
-#define AMF_H264_VBVBUFFER_SIZE TEXT_AMF_H264("VBVBuffer.Size")
-#define AMF_H264_VBVBUFFER_SIZE_DESCRIPTION TEXT_AMF_H264("VBVBuffer.Size.Description")
-#define AMF_H264_VBVBUFFER_FULLNESS TEXT_AMF_H264("VBVBuffer.Fullness")
-#define AMF_H264_VBVBUFFER_FULLNESS_DESCRIPTION TEXT_AMF_H264("VBVBuffer.Fullness.Description")
-#define AMF_H264_FILLERDATA TEXT_AMF_H264("FillerData")
-#define AMF_H264_FILLERDATA_DESCRIPTION TEXT_AMF_H264("FillerData.Description")
-#define AMF_H264_FRAMESKIPPING TEXT_AMF_H264("FrameSkipping")
-#define AMF_H264_FRAMESKIPPING_DESCRIPTION TEXT_AMF_H264("FrameSkipping.Description")
-#define AMF_H264_ENFORCEHRDCOMPATIBILITY TEXT_AMF_H264("EnforceHRDCompatibility")
-#define AMF_H264_ENFORCEHRDCOMPATIBILITY_DESCRIPTION TEXT_AMF_H264("EnforceHRDCompatibility.Description")
-
-// Picture Control Properties
-#define AMF_H264_KEYFRAME_INTERVAL TEXT_AMF_H264("KeyframeInterval")
-#define AMF_H264_KEYFRAME_INTERVAL_DESCRIPTION TEXT_AMF_H264("KeyframeInterval.Description")
-#define AMF_H264_IDR_PERIOD TEXT_AMF_H264("IDRPeriod")
-#define AMF_H264_IDR_PERIOD_DESCRIPTION TEXT_AMF_H264("IDRPeriod.Description")
-#define AMF_H264_BFRAME_PATTERN TEXT_AMF_H264("BFrame.Pattern")
-#define AMF_H264_BFRAME_PATTERN_DESCRIPTION TEXT_AMF_H264("BFrame.Pattern.Description")
-#define AMF_H264_BFRAME_DELTAQP TEXT_AMF_H264("BFrame.DeltaQP")
-#define AMF_H264_BFRAME_DELTAQP_DESCRIPTION TEXT_AMF_H264("BFrame.DeltaQP.Description")
-#define AMF_H264_BFRAME_REFERENCE TEXT_AMF_H264("BFrame.Reference")
-#define AMF_H264_BFRAME_REFERENCE_DESCRIPTION TEXT_AMF_H264("BFrame.Reference.Description")
-#define AMF_H264_BFRAME_REFERENCEDELTAQP TEXT_AMF_H264("BFrame.ReferenceDeltaQP")
-#define AMF_H264_BFRAME_REFERENCEDELTAQP_DESCRIPTION TEXT_AMF_H264("BFrame.ReferenceDeltaQP.Description")
-#define AMF_H264_DEBLOCKINGFILTER TEXT_AMF_H264("DeblockingFilter")
-#define AMF_H264_DEBLOCKINGFILTER_DESCRIPTION TEXT_AMF_H264("DeblockingFilter.Description")
-
-// Miscellaneous Properties
-#define AMF_H264_SCANTYPE TEXT_AMF_H264("ScanType")
-#define AMF_H264_SCANTYPE_DESCRIPTION TEXT_AMF_H264("ScanType.Description")
-#define AMF_H264_SCANTYPE_PROGRESSIVE TEXT_AMF_H264("ScanType.Progressive")
-#define AMF_H264_SCANTYPE_INTERLACED TEXT_AMF_H264("ScanType.Interlaced")
-#define AMF_H264_MOTIONESTIMATION TEXT_AMF_H264("MotionEstimation")
-#define AMF_H264_MOTIONESTIMATION_DESCRIPTION TEXT_AMF_H264("MotionEstimation.Description")
-#define AMF_H264_MOTIONESTIMATION_NONE TEXT_AMF_H264("MotionEstimation.None")
-#define AMF_H264_MOTIONESTIMATION_HALF TEXT_AMF_H264("MotionEstimation.Half")
-#define AMF_H264_MOTIONESTIMATION_QUARTER TEXT_AMF_H264("MotionEstimation.Quarter")
-#define AMF_H264_MOTIONESTIMATION_BOTH TEXT_AMF_H264("MotionEstimation.Both")
-
-// Experimental Properties
-#define AMF_H264_CODINGTYPE TEXT_AMF_H264("CodingType")
-#define AMF_H264_CODINGTYPE_DESCRIPTION TEXT_AMF_H264("CodingType.Description")
-#define AMF_H264_MAXIMUMLTRFRAMES TEXT_AMF_H264("MaximumLTRFrames")
-#define AMF_H264_MAXIMUMLTRFRAMES_DESCRIPTION TEXT_AMF_H264("MaximumLTRFrames.Description")
-#define AMF_H264_MAXIMUMACCESSUNITSIZE TEXT_AMF_H264("MaximumAccessUnitSize")
-#define AMF_H264_MAXIMUMACCESSUNITSIZE_DESCRIPTION TEXT_AMF_H264("MaximumAccessUnitSize.Description")
-#define AMF_H264_HEADER_INSERTION_SPACING TEXT_AMF_H264("HeaderInsertionSpacing")
-#define AMF_H264_HEADER_INSERTION_SPACING_DESCRIPTION TEXT_AMF_H264("HeaderInsertionSpacing.Description")
-#define AMF_H264_WAITFORTASK TEXT_AMF_H264("WaitForTask")
-#define AMF_H264_WAITFORTASK_DESCRIPTION TEXT_AMF_H264("WaitForTask.Description")
-#define AMF_H264_PREANALYSISPASS TEXT_AMF_H264("PreanalysisPass")
-#define AMF_H264_PREANALYSISPASS_DESCRIPTION TEXT_AMF_H264("PreanalysisPass.Description")
-#define AMF_H264_VBAQ TEXT_AMF_H264("VBAQ")
-#define AMF_H264_VBAQ_DESCRIPTION TEXT_AMF_H264("VBAQ.Description")
-#define AMF_H264_GOPSIZE TEXT_AMF_H264("GOPSize")
-#define AMF_H264_GOPSIZE_DESCRIPTION TEXT_AMF_H264("GOPSize.Description")
-#define AMF_H264_GOPALIGNMENT TEXT_AMF_H264("GOPAlignment")
-#define AMF_H264_GOPALIGNMENT_DESCRIPTION TEXT_AMF_H264("GOPAlignment.Description")
-#define AMF_H264_MAXIMUMREFERENCEFRAMES TEXT_AMF_H264("MaximumReferenceFrames")
-#define AMF_H264_MAXIMUMREFERENCEFRAMES_DESCRIPTION TEXT_AMF_H264("MaximumReferenceFrames.Description")
-#define AMF_H264_SLICESPERFRAME TEXT_AMF_H264("SlicesPerFrame")
-#define AMF_H264_SLICESPERFRAME_DESCRIPTION TEXT_AMF_H264("SlicesPerFrame.Description")
-#define AMF_H264_SLICEMODE TEXT_AMF_H264("SliceMode")
-#define AMF_H264_SLICEMODE_DESCRIPTION TEXT_AMF_H264("SliceMode.Description")
-#define AMF_H264_MAXIMUMSLICESIZE TEXT_AMF_H264("MaximumSliceSize")
-#define AMF_H264_MAXIMUMSLICESIZE_DESCRIPTION TEXT_AMF_H264("MaximumSliceSize.Description")
-#define AMF_H264_SLICECONTROLMODE TEXT_AMF_H264("SliceControlMode")
-#define AMF_H264_SLICECONTROLMODE_DESCRIPTION TEXT_AMF_H264("SliceControlMode.Description")
-#define AMF_H264_SLICECONTROLSIZE TEXT_AMF_H264("SliceControlSize")
-#define AMF_H264_SLICECONTROLSIZE_DESCRIPTION TEXT_AMF_H264("SliceControlSize.Description")
-#define AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES TEXT_AMF_H264("IntraRefresh.NumberOfStripes")
-#define AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES_DESCRIPTION TEXT_AMF_H264("IntraRefresh.NumberOfStripes.Description")
-#define AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT TEXT_AMF_H264("IntraRefresh.MacroblocksPerSlot")
-#define AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT_DESCRIPTION TEXT_AMF_H264("IntraRefresh.MacroblocksPerSlot.Description")
-
-// System Properties
-#define AMF_H264_VIDEOAPI TEXT_AMF_H264("VideoAPI")
-#define AMF_H264_VIDEOAPI_DESCRIPTION TEXT_AMF_H264("VideoAPI.Description")
-#define AMF_H264_VIDEOADAPTER TEXT_AMF_H264("VideoAdapter")
-#define AMF_H264_VIDEOADAPTER_DESCRIPTION TEXT_AMF_H264("VideoAdapter.Description")
-#define AMF_H264_OPENCL TEXT_AMF_H264("OpenCL")
-#define AMF_H264_OPENCL_DESCRIPTION TEXT_AMF_H264("OpenCL.Description")
-#define AMF_H264_VIEW TEXT_AMF_H264("View")
-#define AMF_H264_VIEW_DESCRIPTION TEXT_AMF_H264("View.Description")
-#define AMF_H264_VIEW_BASIC TEXT_AMF_H264("View.Basic")
-#define AMF_H264_VIEW_ADVANCED TEXT_AMF_H264("View.Advanced")
-#define AMF_H264_VIEW_EXPERT TEXT_AMF_H264("View.Expert")
-#define AMF_H264_VIEW_MASTER TEXT_AMF_H264("View.Master")
-#define AMF_H264_DEBUG TEXT_AMF_H264("Debug")
-#define AMF_H264_DEBUG_DESCRIPTION TEXT_AMF_H264("Debug.Description")
-#define AMF_H264_VERSION TEXT_AMF_H264("Version")
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
namespace Plugin {
namespace Interface {
class H264Interface {
// Storage
//////////////////////////////////////////////////////////////////////////
private:
- Plugin::AMD::H264Encoder* m_VideoEncoder;
+ std::unique_ptr<Plugin::AMD::EncoderH264> m_VideoEncoder;
+ obs_encoder_t* m_Encoder;
};
}
}
\ No newline at end of file
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/enc-h265.h
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#pragma once
+#include "amf-encoder-h265.h"
+#include "plugin.h"
+
+namespace Plugin {
+ namespace Interface {
+ class H265Interface {
+ public:
+ static void encoder_register();
+ static const char* get_name(void* type_data);
+ static void get_defaults(obs_data_t *data);
+ static obs_properties_t* get_properties(void* ptr);
+
+ static bool properties_modified(obs_properties_t *props, obs_property_t *, obs_data_t *data);
+
+ static void* create(obs_data_t* settings, obs_encoder_t* encoder);
+ static void destroy(void* ptr);
+ static bool update(void *ptr, obs_data_t *data);
+ static bool encode(void *ptr, struct encoder_frame * frame, struct encoder_packet * packet, bool * received_packet);
+ static void get_video_info(void *ptr, struct video_scale_info *info);
+ static bool get_extra_data(void *ptr, uint8_t** extra_data, size_t* size);
+
+ //////////////////////////////////////////////////////////////////////////
+ // Module Code
+ //////////////////////////////////////////////////////////////////////////
+ public:
+
+ H265Interface(obs_data_t* data, obs_encoder_t* encoder);
+ ~H265Interface();
+
+ bool update(obs_data_t* data);
+ bool encode(struct encoder_frame * frame, struct encoder_packet * packet, bool * received_packet);
+ void get_video_info(struct video_scale_info* info);
+ bool get_extra_data(uint8_t** extra_data, size_t* size);
+
+ //////////////////////////////////////////////////////////////////////////
+ // Storage
+ //////////////////////////////////////////////////////////////////////////
+ private:
+ std::unique_ptr<Plugin::AMD::EncoderH265> m_VideoEncoder;
+ obs_encoder_t* m_Encoder;
+ };
+ }
+}
\ No newline at end of file
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Include/plugin.h -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/plugin.h
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
+#define NOMINMAX
+#define NOINOUT
-// Microsoft as always does not follow the standard and declares safe functions unsafe.
-// Or even straight up marks them as deprecated, what the fuck Microsoft?
-#ifdef _MSC_VER
-#define _CRT_SECURE_NO_WARNINGS
-#pragma warning(disable : 4996)
-#endif
-
-#include <cstdint>
-#include <inttypes.h>
-#include <exception>
-#include <stdexcept>
-#include <thread>
-#include <memory>
-
-// Open Broadcaster Software
-#pragma warning( disable: 4201 )
+#pragma warning (push)
+#pragma warning (disable: 4201)
#include "libobs/obs-module.h"
-#include "libobs/obs-encoder.h"
+#include "libobs/util/platform.h"
+#pragma warning (pop)
-MODULE_EXTERN const char *obs_module_text_multi(const char *val, uint8_t depth = (uint8_t)1);
+// Plugin
+#define PLUGIN_NAME "AMD Advanced Media Framework"
+#include "Version.h"
-//////////////////////////////////////////////////////////////////////////
-// Defines
-//////////////////////////////////////////////////////////////////////////
+#define PLOG(level, ...) blog(level, "[AMF] " __VA_ARGS__);
+#define PLOG_ERROR(...) PLOG(LOG_ERROR, __VA_ARGS__)
+#define PLOG_WARNING(...) PLOG(LOG_WARNING, __VA_ARGS__)
+#define PLOG_INFO(...) PLOG(LOG_INFO, __VA_ARGS__)
+#define PLOG_DEBUG(...) PLOG(LOG_DEBUG, __VA_ARGS__)
+// Utility
#define vstr(s) dstr(s)
#define dstr(s) #s
-#define clamp(val,low,high) (val > high ? high : (val < low ? low : val))
-#include "Version.h"
-#define PLUGIN_VERSION_FULL (((uint64_t)PLUGIN_VERSION_MAJOR << 48ull) | ((uint64_t)PLUGIN_VERSION_MINOR << 32ull) | ((uint64_t)PLUGIN_VERSION_PATCH) << 16ul | ((uint64_t)PLUGIN_VERSION_BUILD))
-#define PLUGIN_VERSION_TEXT vstr(PLUGIN_VERSION_MAJOR) "." vstr(PLUGIN_VERSION_MINOR) "." vstr(PLUGIN_VERSION_PATCH) "." vstr(PLUGIN_VERSION_BUILD)
+#define clamp(val,low,high) (val > high ? high : (val < low ? low : val))
+#ifdef max
+#undef max
+#endif
+#define max(val,high) (val > high ? val : high)
+#ifdef min
+#undef min
+#endif
+#define min(val,low ) (val < low ? val : low)
-#define AMF_LOG(level, format, ...) blog(level, "[AMF Encoder] " format, ##__VA_ARGS__);
-#define AMF_LOG_ERROR(format, ...) AMF_LOG(LOG_ERROR, format, ##__VA_ARGS__)
-#define AMF_LOG_WARNING(format, ...) AMF_LOG(LOG_WARNING, format, ##__VA_ARGS__)
-#define AMF_LOG_INFO(format, ...) AMF_LOG(LOG_INFO, format, ##__VA_ARGS__)
-#define AMF_LOG_CONFIG(format, ...) AMF_LOG(350, format, ##__VA_ARGS__)
-#define AMF_LOG_DEBUG(format, ...) AMF_LOG(LOG_DEBUG, format, ##__VA_ARGS__)
+#ifdef IN
+#undef IN
+#endif
+#define IN
+#ifdef OUT
+#undef OUT
+#endif
+#define OUT
-#define ThrowException(format, ...) {\
- std::vector<char> _throwexceptionwithamferror_buf(8192);\
- sprintf_s(_throwexceptionwithamferror_buf.data(), _throwexceptionwithamferror_buf.size(), format, ##__VA_ARGS__);\
- AMF_LOG_WARNING("%s", _throwexceptionwithamferror_buf.data()); \
- throw std::exception(_throwexceptionwithamferror_buf.data()); \
-}
-#define ThrowExceptionWithAMFError(format, res, ...) {\
- std::vector<char> _throwexceptionwithamferror_buf(8192);\
- sprintf_s(_throwexceptionwithamferror_buf.data(), _throwexceptionwithamferror_buf.size(), format, ##__VA_ARGS__, Plugin::AMD::AMF::GetInstance()->GetTrace()->GetResultText(res), res);\
- AMF_LOG_WARNING("%s", _throwexceptionwithamferror_buf.data()); \
- throw std::exception(_throwexceptionwithamferror_buf.data()); \
-}
+#define QUICK_FORMAT_MESSAGE(var, ...) std::string var = ""; { \
+ std::vector<char> QUICK_FORMAT_MESSAGE_buf(1024); \
+ snprintf(QUICK_FORMAT_MESSAGE_buf.data(), QUICK_FORMAT_MESSAGE_buf.size(), __VA_ARGS__); \
+ var = std::string(QUICK_FORMAT_MESSAGE_buf.data()); \
+ }
#ifndef __FUNCTION_NAME__
#if defined(_WIN32) || defined(_WIN64) //WINDOWS
#endif
#endif
-//////////////////////////////////////////////////////////////////////////
-// Defines - Translation Strings
-//////////////////////////////////////////////////////////////////////////
-
-#define TEXT_T(x) obs_module_text_multi(x)
-#define TEXT_AMF(x) ("AMF." ## x)
-#define TEXT_AMF_H264(x) (TEXT_AMF("H264." ## x))
-#define TEXT_AMF_UTIL(x) (TEXT_AMF("Util." ## x))
-
-// Utility
-#define AMF_UTIL_DEFAULT TEXT_AMF_UTIL("Default")
-#define AMF_UTIL_AUTOMATIC TEXT_AMF_UTIL("Automatic")
-#define AMF_UTIL_MANUAL TEXT_AMF_UTIL("Manual")
-#define AMF_UTIL_TOGGLE_DISABLED TEXT_AMF_UTIL("Toggle.Disabled")
-#define AMF_UTIL_TOGGLE_ENABLED TEXT_AMF_UTIL("Toggle.Enabled")
-
-//////////////////////////////////////////////////////////////////////////
-// Threading Specific
-//////////////////////////////////////////////////////////////////////////
-
-#if (defined _WIN32) || (defined _WIN64)
-void SetThreadName(uint32_t dwThreadID, const char* threadName);
-void SetThreadName(const char* threadName);
-void SetThreadName(std::thread* thread, const char* threadName);
-
-#else
-void SetThreadName(std::thread* thread, const char* threadName);
-void SetThreadName(const char* threadName);
-
-#endif
+enum class Presets : int8_t {
+ None = -1,
+ ResetToDefaults = 0,
+ Recording,
+ HighQuality,
+ Indistinguishable,
+ Lossless,
+ Twitch,
+ YouTube,
+};
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/strings.h
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#pragma once
+#include "plugin.h"
+#include "utility.h"
+
+#define P_TRANSLATE(x) Utility::obs_module_text_multi(x)
+#define P_DESC(x) x ".Description"
+
+// Shared
+#define P_VERSION "Version"
+#define P_UTIL_DEFAULT "Utility.Default"
+#define P_UTIL_AUTOMATIC "Utility.Automatic"
+#define P_UTIL_MANUAL "Utility.Manual"
+#define P_UTIL_SWITCH_DISABLED "Utility.Switch.Disabled"
+#define P_UTIL_SWITCH_ENABLED "Utility.Switch.Enabled"
+
+// Presets
+#define P_PRESET "Preset"
+#define P_PRESET_RESETTODEFAULTS "Preset.ResetToDefaults"
+#define P_PRESET_RECORDING "Preset.Recording"
+#define P_PRESET_HIGHQUALITY "Preset.HighQuality"
+#define P_PRESET_INDISTINGUISHABLE "Preset.Indistinguishable"
+#define P_PRESET_LOSSLESS "Preset.Lossless"
+#define P_PRESET_TWITCH "Preset.Twitch"
+#define P_PRESET_YOUTUBE "Preset.YouTube"
+
+// Static
+//#define P_USAGE "Usage"
+//#define P_USAGE_TRANSCODING "Usage.Transcoding"
+//#define P_USAGE_ULTRALOWLATENCY "Usage.UltraLowLatency"
+//#define P_USAGE_LOWLATENCY "Usage.LowLatency"
+//#define P_USAGE_WEBCAM "Usage.Webcam"
+#define P_QUALITYPRESET "QualityPreset"
+#define P_QUALITYPRESET_SPEED "QualityPreset.Speed"
+#define P_QUALITYPRESET_BALANCED "QualityPreset.Balanced"
+#define P_QUALITYPRESET_QUALITY "QualityPreset.Quality"
+#define P_PROFILE "Profile"
+#define P_PROFILELEVEL "ProfileLevel"
+#define P_TIER "Tier"
+#define P_ASPECTRATIO "AspectRatio"
+#define P_CODINGTYPE "CodingType"
+#define P_CODINGTYPE_CABAC "CodingType.CABAC"
+#define P_CODINGTYPE_CAVLC "CodingType.CAVLC"
+#define P_MAXIMUMREFERENCEFRAMES "MaximumReferenceFrames"
+
+// Rate Control
+#define P_RATECONTROLMETHOD "RateControlMethod"
+#define P_RATECONTROLMETHOD_CQP "RateControlMethod.CQP"
+#define P_RATECONTROLMETHOD_CBR "RateControlMethod.CBR"
+#define P_RATECONTROLMETHOD_VBR "RateControlMethod.VBR"
+#define P_RATECONTROLMETHOD_VBRLAT "RateControlMethod.VBRLAT"
+#define P_PREPASSMODE "PrePassMode"
+#define P_PREPASSMODE_QUARTER "PrePassMode.Quarter"
+#define P_PREPASSMODE_HALF "PrePassMode.Half"
+#define P_PREPASSMODE_FULL "PrePassMode.Full"
+#define P_BITRATE_TARGET "Bitrate.Target"
+#define P_BITRATE_PEAK "Bitrate.Peak"
+#define P_QP_MINIMUM "QP.Minimum" // H264
+#define P_QP_MAXIMUM "QP.Maximum" // H264
+#define P_QP_IFRAME "QP.IFrame"
+#define P_QP_IFRAME_MINIMUM "QP.IFrame.Minimum" // H265
+#define P_QP_IFRAME_MAXIMUM "QP.IFrame.Maximum" // H265
+#define P_QP_PFRAME "QP.PFrame"
+#define P_QP_PFRAME_MINIMUM "QP.PFrame.Minimum" // H265
+#define P_QP_PFRAME_MAXIMUM "QP.PFrame.Maximum" // H265
+#define P_QP_BFRAME "QP.BFrame" // H264
+#define P_FILLERDATA "FillerData"
+#define P_FRAMESKIPPING "FrameSkipping"
+#define P_FRAMESKIPPING_PERIOD "FrameSkipping.Period"
+#define P_FRAMESKIPPING_BEHAVIOUR "FrameSkipping.Behaviour"
+#define P_FRAMESKIPPING_SKIPNTH "FrameSkipping.SkipNth"
+#define P_FRAMESKIPPING_KEEPNTH "FrameSkipping.KeepNth"
+#define P_VBAQ "VBAQ"
+#define P_ENFORCEHRD "EnforceHRD"
+
+// VBV Buffer
+#define P_VBVBUFFER "VBVBuffer"
+#define P_VBVBUFFER_SIZE "VBVBuffer.Size"
+#define P_VBVBUFFER_STRICTNESS "VBVBuffer.Strictness"
+#define P_VBVBUFFER_INITIALFULLNESS "VBVBuffer.InitialFullness"
+
+// Picture Control
+#define P_INTERVAL_KEYFRAME "Interval.Keyframe"
+#define P_PERIOD_IDR_H264 "Period.IDR.H264" // H264
+#define P_PERIOD_IDR_H265 "Period.IDR.H265" // H265
+#define P_INTERVAL_IFRAME "Interval.IFrame"
+#define P_PERIOD_IFRAME "Period.IFrame"
+#define P_INTERVAL_PFRAME "Interval.PFrame"
+#define P_PERIOD_PFRAME "Period.PFrame"
+#define P_INTERVAL_BFRAME "Interval.BFrame"
+#define P_PERIOD_BFRAME "Period.BFrame"
+#define P_GOP_TYPE "GOP.Type" // H265
+#define P_GOP_TYPE_FIXED "GOP.Type.Fixed" // H265
+#define P_GOP_TYPE_VARIABLE "GOP.Type.Variable" // H265
+#define P_GOP_SIZE "GOP.Size" // H265
+#define P_GOP_SIZE_MINIMUM "GOP.Size.Minimum" // H265
+#define P_GOP_SIZE_MAXIMUM "GOP.Size.Maximum" // H265
+#define P_GOP_ALIGNMENT "GOP.Alignment" // Both?
+#define P_BFRAME_PATTERN "BFrame.Pattern" // H264
+#define P_BFRAME_DELTAQP "BFrame.DeltaQP" // H264
+#define P_BFRAME_REFERENCE "BFrame.Reference" // H264
+#define P_BFRAME_REFERENCEDELTAQP "BFrame.ReferenceDeltaQP" // H264
+#define P_DEBLOCKINGFILTER "DeblockingFilter"
+#define P_MOTIONESTIMATION "MotionEstimation"
+#define P_MOTIONESTIMATION_QUARTER "MotionEstimation.Quarter"
+#define P_MOTIONESTIMATION_HALF "MotionEstimation.Half"
+#define P_MOTIONESTIMATION_FULL "MotionEstimation.Full"
+
+// System
+#define P_VIDEO_API "Video.API"
+#define P_VIDEO_ADAPTER "Video.Adapter"
+#define P_OPENCL_TRANSFER "OpenCL.Transfer"
+#define P_OPENCL_CONVERSION "OpenCL.Conversion"
+#define P_ASYNCHRONOUSQUEUE "AsynchronousQueue"
+#define P_ASYNCHRONOUSQUEUE_SIZE "AsynchronousQueue.Size"
+#define P_DEBUG "Debug"
+
+#define P_VIEW "View"
+#define P_VIEW_BASIC "View.Basic"
+#define P_VIEW_ADVANCED "View.Advanced"
+#define P_VIEW_EXPERT "View.Expert"
+#define P_VIEW_MASTER "View.Master"
+enum class ViewMode : uint8_t {
+ Basic,
+ Advanced,
+ Expert,
+ Master
+};
+
+/// Other - Missing Functionality
+//#define AMF_H264_MAXIMUMLTRFRAMES TEXT_AMF_H264("MaximumLTRFrames")
+//#define AMF_H264_MAXIMUMLTRFRAMES_DESCRIPTION TEXT_AMF_H264("MaximumLTRFrames.Description")
+//#define AMF_H264_MAXIMUMACCESSUNITSIZE TEXT_AMF_H264("MaximumAccessUnitSize")
+//#define AMF_H264_MAXIMUMACCESSUNITSIZE_DESCRIPTION TEXT_AMF_H264("MaximumAccessUnitSize.Description")
+//#define AMF_H264_HEADER_INSERTION_SPACING TEXT_AMF_H264("HeaderInsertionSpacing")
+//#define AMF_H264_HEADER_INSERTION_SPACING_DESCRIPTION TEXT_AMF_H264("HeaderInsertionSpacing.Description")
+//#define AMF_H264_SLICESPERFRAME TEXT_AMF_H264("SlicesPerFrame")
+//#define AMF_H264_SLICESPERFRAME_DESCRIPTION TEXT_AMF_H264("SlicesPerFrame.Description")
+//#define AMF_H264_SLICEMODE TEXT_AMF_H264("SliceMode")
+//#define AMF_H264_SLICEMODE_DESCRIPTION TEXT_AMF_H264("SliceMode.Description")
+//#define AMF_H264_MAXIMUMSLICESIZE TEXT_AMF_H264("MaximumSliceSize")
+//#define AMF_H264_MAXIMUMSLICESIZE_DESCRIPTION TEXT_AMF_H264("MaximumSliceSize.Description")
+//#define AMF_H264_SLICECONTROLMODE TEXT_AMF_H264("SliceControlMode")
+//#define AMF_H264_SLICECONTROLMODE_DESCRIPTION TEXT_AMF_H264("SliceControlMode.Description")
+//#define AMF_H264_SLICECONTROLSIZE TEXT_AMF_H264("SliceControlSize")
+//#define AMF_H264_SLICECONTROLSIZE_DESCRIPTION TEXT_AMF_H264("SliceControlSize.Description")
+//#define AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES TEXT_AMF_H264("IntraRefresh.NumberOfStripes")
+//#define AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES_DESCRIPTION TEXT_AMF_H264("IntraRefresh.NumberOfStripes.Description")
+//#define AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT TEXT_AMF_H264("IntraRefresh.MacroblocksPerSlot")
+//#define AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT_DESCRIPTION TEXT_AMF_H264("IntraRefresh.MacroblocksPerSlot.Description")
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Include/utility.h
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#pragma once
+#include "amf.h"
+#include "amf-encoder.h"
+#include "components/VideoConverter.h"
+#ifdef WITH_AVC
+#include "amf-encoder-h264.h"
+#include "components/VideoEncoderVCE.h"
+#endif
+#ifdef WITH_HEVC
+#include "amf-encoder-h265.h"
+#include "components/VideoEncoderHEVC.h"
+#endif
+
+namespace Utility {
+ uint64_t GetUniqueIdentifier();
+ const char *obs_module_text_multi(const char *val, uint8_t depth = (uint8_t)1);
+
+ // Codec
+ const char* CodecToString(Plugin::AMD::Codec v);
+ const wchar_t* CodecToAMF(Plugin::AMD::Codec v);
+
+ // Color Format
+ const char* ColorFormatToString(Plugin::AMD::ColorFormat v);
+ amf::AMF_SURFACE_FORMAT ColorFormatToAMF(Plugin::AMD::ColorFormat v);
+
+ // Color Space
+ const char* ColorSpaceToString(Plugin::AMD::ColorSpace v);
+ AMF_VIDEO_CONVERTER_COLOR_PROFILE_ENUM ColorSpaceToAMFConverter(Plugin::AMD::ColorSpace v);
+
+ // Usage
+ const char* UsageToString(Plugin::AMD::Usage v);
+ #ifdef WITH_AVC
+ AMF_VIDEO_ENCODER_USAGE_ENUM UsageToAMFH264(Plugin::AMD::Usage v);
+ Plugin::AMD::Usage UsageFromAMFH264(AMF_VIDEO_ENCODER_USAGE_ENUM v);
+ #endif
+ #ifdef WITH_HEVC
+ AMF_VIDEO_ENCODER_HEVC_USAGE_ENUM UsageToAMFH265(Plugin::AMD::Usage v);
+ Plugin::AMD::Usage UsageFromAMFH265(AMF_VIDEO_ENCODER_HEVC_USAGE_ENUM v);
+ #endif
+
+ // Quality Preset
+ const char* QualityPresetToString(Plugin::AMD::QualityPreset v);
+ #ifdef WITH_AVC
+ AMF_VIDEO_ENCODER_QUALITY_PRESET_ENUM QualityPresetToAMFH264(Plugin::AMD::QualityPreset v);
+ Plugin::AMD::QualityPreset QualityPresetFromAMFH264(AMF_VIDEO_ENCODER_QUALITY_PRESET_ENUM v);
+ #endif
+ #ifdef WITH_HEVC
+ AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_ENUM QualityPresetToAMFH265(Plugin::AMD::QualityPreset v);
+ Plugin::AMD::QualityPreset QualityPresetFromAMFH265(AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_ENUM v);
+ #endif
+
+ // Profile
+ const char* ProfileToString(Plugin::AMD::Profile v);
+ #ifdef WITH_AVC
+ AMF_VIDEO_ENCODER_PROFILE_ENUM ProfileToAMFH264(Plugin::AMD::Profile v);
+ Plugin::AMD::Profile ProfileFromAMFH264(AMF_VIDEO_ENCODER_PROFILE_ENUM v);
+ #endif
+ #ifdef WITH_HEVC
+ AMF_VIDEO_ENCODER_HEVC_PROFILE_ENUM ProfileToAMFH265(Plugin::AMD::Profile v);
+ Plugin::AMD::Profile ProfileFromAMFH265(AMF_VIDEO_ENCODER_HEVC_PROFILE_ENUM v);
+ #endif
+
+ // Tier
+ #ifdef WITH_HEVC
+ const char* TierToString(Plugin::AMD::H265::Tier v);
+ AMF_VIDEO_ENCODER_HEVC_TIER_ENUM TierToAMFH265(Plugin::AMD::H265::Tier v);
+ Plugin::AMD::H265::Tier TierFromAMFH265(AMF_VIDEO_ENCODER_HEVC_TIER_ENUM v);
+ #endif
+
+ // Coding Type
+ const char* CodingTypeToString(Plugin::AMD::CodingType v);
+ #ifdef WITH_AVC
+ AMF_VIDEO_ENCODER_CODING_ENUM CodingTypeToAMFH264(Plugin::AMD::CodingType v);
+ Plugin::AMD::CodingType CodingTypeFromAMFH264(AMF_VIDEO_ENCODER_CODING_ENUM v);
+ #endif
+ #ifdef WITH_HEVC
+ int64_t CodingTypeToAMFH265(Plugin::AMD::CodingType v);
+ Plugin::AMD::CodingType CodingTypeFromAMFH265(int64_t v);
+ #endif
+
+ // Rate Control Method
+ const char* RateControlMethodToString(Plugin::AMD::RateControlMethod v);
+ #ifdef WITH_AVC
+ AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_ENUM RateControlMethodToAMFH264(Plugin::AMD::RateControlMethod v);
+ Plugin::AMD::RateControlMethod RateControlMethodFromAMFH264(AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_ENUM v);
+ #endif
+ #ifdef WITH_HEVC
+ AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_ENUM RateControlMethodToAMFH265(Plugin::AMD::RateControlMethod v);
+ Plugin::AMD::RateControlMethod RateControlMethodFromAMFH265(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_ENUM v);
+ #endif
+
+ // Pre-Pass Method
+ const char* PrePassModeToString(Plugin::AMD::PrePassMode v);
+ #ifdef WITH_AVC
+ AMF_VIDEO_ENCODER_PREENCODE_MODE_ENUM PrePassModeToAMFH264(Plugin::AMD::PrePassMode v);
+ Plugin::AMD::PrePassMode PrePassModeFromAMFH264(AMF_VIDEO_ENCODER_PREENCODE_MODE_ENUM v);
+ #endif
+
+ // GOP Type
+ #ifdef WITH_HEVC
+ const char* GOPTypeToString(Plugin::AMD::H265::GOPType v);
+ Plugin::AMD::H265::GOPType GOPTypeFromAMFH265(int64_t v);
+ int64_t GOPTypeToAMFH265(Plugin::AMD::H265::GOPType v);
+ #endif
+
+ // Slicing
+ #ifdef WITH_AVC
+ const char* SliceModeToString(Plugin::AMD::H264::SliceMode v);
+ #endif
+ const char* SliceControlModeToString(Plugin::AMD::SliceControlMode v);
+
+ #ifdef WITH_AVC
+ Plugin::AMD::ProfileLevel H264ProfileLevel(std::pair<uint32_t, uint32_t> resolution, std::pair<uint32_t, uint32_t> frameRate);
+ #endif
+ #ifdef WITH_HEVC
+ Plugin::AMD::ProfileLevel H265ProfileLevel(std::pair<uint32_t, uint32_t> resolution, std::pair<uint32_t, uint32_t> frameRate);
+ #endif
+
+ //////////////////////////////////////////////////////////////////////////
+ // Threading Specific
+ //////////////////////////////////////////////////////////////////////////
+
+ #if (defined _WIN32) || (defined _WIN64)
+ void SetThreadName(uint32_t dwThreadID, const char* threadName);
+ void SetThreadName(const char* threadName);
+ void SetThreadName(std::thread* pthread, const char* threadName);
+ #else
+ void SetThreadName(std::thread* pthread, const char* threadName);
+ void SetThreadName(const char* threadName);
+ #endif
+}
\ No newline at end of file
obs-studio-18.0.1.tar.xz/plugins/enc-amf/LICENSE -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/LICENSE
Changed
The MIT License (MIT)
-Copyright (c) 2016
+Copyright (c) 2016-2017 Michael Fabian Dirks
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
obs-studio-18.0.1.tar.xz/plugins/enc-amf/README.md -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/README.md
Changed
-This is a plugin for OBS Studio that enables almost fully native hardware encoding instead of passing things off to Media Foundation Transform which is known to have issues with some certain settings. The plugin is based on the new AMF SDK and thus will work with any future driver releases.
+# What is this?
+This is a plugin for the [Open Broadcaster Software](https://obsproject.com/) project that adds support for AMD hardware encoding through [AMD's Advanced Media Framework](https://github.com/GPUOpen-LibrariesAndSDKs/AMF).
-[Read More on the Wiki](https://github.com/Xaymar/obs-studio_amf-encoder-plugin/wiki)
+Starting with Open Broadcaster Software Studio version 0.16.2, this plugin replaced the old Media Foundation Transform based approach (which caused issues and slowness). Because of this, Open Broadcaster Software Studio is now a better choice than the old Open Broadcaster Software Classic VCE branch.
-## Contributing
+## Requirements
+The plugin requires that you have a supported AMD GPU and have the following installed and updated to the latest version:
-Read [CONTRIBUTING.md](https://github.com/Xaymar/obs-studio_amf-encoder-plugin/blob/master/CONTRIBUTING.md) for a guide on how to start.
+* [Visual Studio 2015 Redistributables](https://www.microsoft.com/en-us/download/details.aspx?id=48145)
+* [AMD Graphics Driver](https://support.amd.com/en-us/download)
+* [Open Broadcaster Software Studio](https://obsproject.com/download#mp)
+* [AMD Advanced Media Framework Plugin](https://github.com/Xaymar/obs-studio_amf-encoder-plugin/tags)
-# Contributors
-These people have helped (in one way or another) making this project possible. Without them, this project would have most likely not been where it is.
+# Read More
-| Who | What |
-| --- | ---- |
-| [Jim](https://github.com/jp9000) | Origin of the Open Broadcaster Software, provided support for development questions. |
-| [jackun](http://github.com/jackun) | The awesome person that started it all with his own Classic and Studio fork.<br>Thanks to him this project even exists today. |
-| [Xaymar](http://github.com/Xaymar) | Just the guy who spent a week staring at his monitors, trying to figure out how to make it work. |
-| GolDAce | CrowdIn integration. |
-| [leporel](https://github.com/leporel) | Provided ru-RU language files (Russian) |
-| [max20091](https://github.com/max20091) | Provided vi-VN language files (Vietnamese) |
-| [M4RK22](https://github.com/M4RK22) | Provided es-ES language files (Spanish) |
-| [niteforce](https://github.com/niteforce) | Provided hu-HU language files (Hungarian) |
-| [nwgat](https://github.com/nwgat) | Provided nb-NO language files (Norwegian) |
-| [wazerstar](https://github.com/wazerstar) | Provided da-DK language files (Danish) |
-| AMD | Providing Media SDK and later providing the AMF SDK. Also for incredibly fast fixing of bugs. |
-
-## The Amazing Supporters
-Special thanks go out to those that have either donated to the project directly or have decided to put some of their spare money into [my Patreon](https://patreon.com/xaymar). You guys rock!
-
-**[Jim](https://github.com/jp9000)**
-Basically created the entire OBS project, without him it wouldn't even be here.
-
-**Kytos/M4RK22** <!-- https://www.patreon.com/user?u=3762404 -->
-Im happy to support a nice recording plugin for AMD users.
-Patron: 2016 August-November
-[Website](https://markitos.ovh), [Steam](http://steamcommunity.com/id/markitos22/)
-
-**nwgat.ninja** <!-- https://www.patreon.com/user?u=2885495 -->
-nwgat.ninja is proud to support Xaymars Technology Projects.
-Patron: 2016 August-November
-[Website](https://nwgat.ninja)
-
-**Mattheus Wiegand**
-Patron: 2016 August, September
-[Twitter](https://twitter.com/Morphy2k/), [GitHub](https://github.com/Morphy2k)
-
-**Jeremy "razorlikes" Nieth** <!-- https://www.patreon.com/user?u=2463662 -->
-I like to support this project because it gives me a way to stream without having to sacrifice immense amounts of cpu resources for encoding.
-Patron: 2016 August-November
-[Twitch](https://twitch.tv/razorlikes), [GitHub](https://github.com/razorlikes)
-
-**Kristian Kirkesæther** <!-- https://www.patreon.com/user?u=3963961 -->
-Patron: 2016 September-November
-
-**vektorDex** <!-- https://www.patreon.com/vDex -->
-Patron: 2016 September-November
-[Website](http://blog-of-dex.de/), [Twitter](https://twitter.com/vektordex), [Studio](http://digitaldawnstudios.com)
-
-**AJ** <!-- https://www.patreon.com/user?u=3931856 -->
-Patron: 2016 September-November
-
-**SneakyJoe** <!-- https://www.patreon.com/sneaky4oe -->
-Russian streamer and stream teacher, AMD fanboy. Wants to make AMD great again.
-Patron: 2016 September-November
-[Website](http://sneakyjoe.ru/), [YouTube](https://www.youtube.com/channel/UCUmRv5GwQcsnxXRzuPCGr-Q)
-
-**Nicholas Kreimeyer** <!-- https://www.patreon.com/user?u=280867 -->
-Patron: 2016 October, November
-
-**noext** <!-- https://www.patreon.com/user?u=3209509 -->
-Patron: 2016 October
-
-**John Difool** <!-- https://www.patreon.com/user?u=3972864 -->
-John Difool der alte Sack
-Patron: 2016 October, November
-[YouTube](https://www.youtube.com/channel/UC5FPsFLQh4ah0-vz-eoZlOA)
-
-**DaOrgest** <!-- https://www.patreon.com/daorgest -->
-Currently studying computer and I do YouTube for a hobby
-Patron: 2016 September-November
-[Website](http://daorgest.me), [YouTube](http://youtube.com/daorgest)
-
-**Nucu** <!-- https://www.patreon.com/user?u=187366 -->
-Thanks AMD and Xaymar to make hardware encoding possible.
-Patron: 2016 August-November
-
-**Daniel Bagge** <!-- https://www.patreon.com/user?u=2457937 -->
-Patron: 2016 September-November
-
-**Cihangir Ceviren** <!-- https://www.patreon.com/user?u=4509018 -->
-Patron: 2016 November
-
-# MIT License
-
-Copyright (c) 2016 Michael Fabian Dirks
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
\ No newline at end of file
+You can [read more about it on the Wiki](https://github.com/Xaymar/obs-studio_amf-encoder-plugin/wikis/home)!
\ No newline at end of file
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/ca-ES.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/ca-ES.ini
Changed
-AMF.Util.Default="Per defecte"
-AMF.Util.Automatic="Automàtic"
-AMF.Util.Manual="Manual"
-AMF.Util.Toggle.Disabled="Desactivat"
-AMF.Util.Toggle.Enabled="Activat"
-AMF.H264.Preset="Configuració preestablerta"
-AMF.H264.Preset.ResetToDefaults="Restableix als valors per defecte"
-AMF.H264.Preset.Recording="S'està enregistrant"
-AMF.H264.Preset.HighQuality="Alta qualitat"
-AMF.H264.Preset.Indistinguishable="Indistinguible"
-AMF.H264.Preset.Lossless="Sense pèrdues"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Ús"
-AMF.H264.Usage.Description="Quin ús s'ha d'ajustar AMF:\n- 'Codificació' és per a ús generalitzat (recomanat),\n- 'Latència Ultra Baixa' és per a una codificació de molt baixa latència,\n- 'Latència Baixa' és similar a l'anterior amb alguna cosa mes de latència.\nLes retransmissions en directe només suporten 'Codificació', tots els valors poden ser utilitzats per a enregistrar."
-AMF.H264.Usage.Transcoding="Transcodificació"
-AMF.H264.Usage.UltraLowLatency="Latència ultra baixa"
-AMF.H264.Usage.LowLatency="Latència baixa"
-AMF.H264.QualityPreset="Qualitat del perfil"
-AMF.H264.QualityPreset.Description="Quina qualitat del perfil d'AMD s'ha d'intentar aconseguir:\n- 'Velocitat' és la més ràpida però la que pitjor qualitat obté,\n- 'Equilibrat' està entre 'Velocitat' i 'Qualitat' oferint un balanç entre els dos,\n- 'Qualitat' ofereix la millor qualitat possible per una determinada tassa de marcs."
-AMF.H264.QualityPreset.Speed="Velocitat"
-AMF.H264.QualityPreset.Balanced="Equilibrat"
-AMF.H264.QualityPreset.Quality="Qualitat"
-AMF.H264.Profile="Perfil"
-AMF.H264.Profile.Description="Quin perfil H.264 s'ha d'utilitzar per la codificació, ordenats de major qualitat al més suportat."
-AMF.H264.ProfileLevel="Nivell de perfil"
-AMF.H264.ProfileLevel.Description="Nivell de perfil H.264 a utilitzar per la codificació:\n- 'Automàtic' calcula el millor nivell de perfil per certa velocitat i mida de marcs,\n- '4.1' suporta 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' suporta 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' suporta 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' suporta 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' suporta 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Mètode de control del flux"
-AMF.H264.RateControlMethod.Description="Quin mètode de control de flux s'ha d'utilitzar:\n- '\@AMF.H264.RateControlMethod.CQP\@' assigna valors fixos de QP a I-/P-/B-Frames (Paràmetre de quantització),\n- '\@AMF.H264.RateControlMethod.CBR\@' es manté en la tassa de marcs objectiu (utilitzant dades de farciment) (recomanat per transmissions en directe),\n- '\@AMF.H264.RateControlMethod.VBR\@' es manté per sota d'un pic de tassa de marcs,\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@' es manté prop de la tassa de marcs desitjada si la latència i carrega de la GPU ho permet, si no s'augmentarà la taxa de marcs (recomanat per a enregistraments)."
-AMF.H264.RateControlMethod.CQP="QP constant (CQP)"
-AMF.H264.RateControlMethod.CBR="Flux constant (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Flux variable (pic restringit)(VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Flux variable (latència restringida) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Tassa de bits desitjada"
-AMF.H264.Bitrate.Target.Description="Tassa de marcs a intentar arribar a la seqüència general."
-AMF.H264.Bitrate.Peak="Pic de tassa de bits"
-AMF.H264.Bitrate.Peak.Description="Tassa de marcs a intentar aconseguir com pic màxim en la seqüència general."
-AMF.H264.QP.Minimum="QP mínim"
-AMF.H264.QP.Minimum.Description="Valor mínim de QP (paràmetre de quantització) a utilitzar en un fotograma."
-AMF.H264.QP.Maximum="QP màxim"
-AMF.H264.QP.Maximum.Description="Valor màxim de QP (paràmetre de quantització) a utilitzar en un fotograma."
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.IFrame.Description="Valor fix de QP per I-Frames."
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.PFrame.Description="Valor fix de QP per P-Frames."
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.QP.BFrame.Description="Valor de QP fix (paràmetre de quantització) a utilitzar per B-Frames."
-AMF.H264.VBVBuffer="Memòria intermèdia VBV"
-AMF.H264.VBVBuffer.Description="Quin mètode s'ha d'utilitzar per determinar la mida de la memòria intermèdia VBV:\n- 'Automàtic' calcula la mida utilitzant una restricció estricta,\n- 'Manual' permet a l'usuari controlar la mida.\nLa memòria intermèdia VBV (Verificador de la memòria intermèdia del vídeo) és usat per certs mètodes de control del flux per mantenir la taxa de bits dins dels paràmetres establerts."
-AMF.H264.VBVBuffer.Strictness="Rigorositat de la memòria intermèdia VBV"
-AMF.H264.VBVBuffer.Strictness.Description="Determina la rigidesa de la memòria intermèdia VBV, con 100% essent tant estricte com sigui possible i 0% sense restricció."
-AMF.H264.VBVBuffer.Size="Mida de la memòria intermèdia VBV"
-AMF.H264.VBVBuffer.Size.Description="La mida de la memòria intermèdia VBV que s'utilitza per al control de la tassa de marcs en una seqüencia."
-AMF.H264.VBVBuffer.Fullness="Amplitud de la memòria intermèdia VBV"
-AMF.H264.VBVBuffer.Fullness.Description="Com de ple és la memòria intermèdia VMV inicialment, només afectarà la seqüència inicial de la codificació."
-AMF.H264.FillerData="Dades a omplir"
-AMF.H264.FillerData.Description="En activar les dades de farciment es permet al codificador mantenir almenys la tassa de marcs desitjada omplint l'espai que falta amb informació sense valor."
-AMF.H264.FrameSkipping="Omissió de fotogrames"
-AMF.H264.FrameSkipping.Description="L'omissió de fotogrames permet al codificador saltar fotogrames per complir amb el requeriment de la tassa de marcs objectiu.\nQuan el codificador salta un fotograma inserirà una NAL que repetirà el darrer fotograma codificat a la transmissió.\nPot ajudar amb tassa de bits objectiu molt baixes."
-AMF.H264.EnforceHRDCompatibility="Força la compatibilitat amb HRD"
-AMF.H264.EnforceHRDCompatibility.Description="Força les restriccions del descodificador hipotètic de referència que limiten el canvi de valor màxim de QP dins d'un fotograma."
-AMF.H264.KeyframeInterval="Interval de fotogrames clau"
-AMF.H264.KeyframeInterval.Description="Quants segons han d'haver entre fotogrames que no es poden descartar.\nTambé controla la mida de la seqüència (GOP)."
-AMF.H264.IDRPeriod="Període IDR"
-AMF.H264.IDRPeriod.Description="Defineix la distància entre Instantaneous Decoding Refreshes (IDR) en fotogrames. També controla la mida de la seqüència del GOP."
-AMF.H264.BFrame.Pattern="B-Frames"
-AMF.H264.BFrame.Pattern.Description="La quantitat de B-Frames a utilitzar mestre es codifica.\nCompatible amb targetes de 2ª i 3ª generació VCE. Impacte negatiu en el rendiment de codificació."
-AMF.H264.BFrame.DeltaQP="Delta QP per B-Frames"
-AMF.H264.BFrame.DeltaQP.Description="Valor Delta QP per al darrer I- o P-Frame per B-Frames no referenciables."
-AMF.H264.BFrame.Reference="B-Frames referenciables"
-AMF.H264.BFrame.Reference.Description="Permet a un B-Frames utilitzar també B-Frames com referència, enlloc de P- i I-Frames."
-AMF.H264.BFrame.ReferenceDeltaQP="Delta QP per als fotogrames referenciables"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="Valor Delta QP per al darrer I- o P-Frame per B-Frames referenciables."
-AMF.H264.DeblockingFilter="Filtre d'eliminació de blocs"
-AMF.H264.DeblockingFilter.Description="Estableix l'indicador que el descodificador està permès a utilitzar el Filtre d'eliminació de blocs per a la transmissió codificada."
-AMF.H264.ScanType="Tipus d'exploració"
-AMF.H264.ScanType.Description="Quin mètode de escaneig utilitzar, deixeu-lo sempre a '\@AMF.H264.ScanType.Progressive\@'."
-AMF.H264.ScanType.Progressive="Progressiu"
-AMF.H264.ScanType.Interlaced="Entrellaçat"
-AMF.H264.MotionEstimation="Estimació del moviment"
-AMF.H264.MotionEstimation.Description="L'estimació del moviment permet al codificador reduir el flux de dades necessari calculant d'on vénen els píxels."
-AMF.H264.MotionEstimation.None="Cap"
-AMF.H264.MotionEstimation.Half="Meitat de píxel"
-AMF.H264.MotionEstimation.Quarter="Quart de Píxel"
-AMF.H264.MotionEstimation.Both="Meitat i quart de píxel"
-AMF.H264.CodingType="Tipus de codificació"
-AMF.H264.CodingType.Description="Quin tipus de codificació utilitzar:\n* \@AMF.Util.Default\@ deixeu que AMF ho decideixi (recomanat).\n* CALVC (Context-Adaptive Variable-Length Coding) és més ràpid, però mes gran.\n* CABAC (Context-Adaptive Binary Arithmetic Coding) es més lent, però més petit."
+Utility.Default="Per defecte"
+Utility.Automatic="Automàtic"
+Utility.Manual="Manual"
+Utility.Switch.Disabled="Desactivat"
+Utility.Switch.Enabled="Activat"
+Preset="Configuració preestablerta"
+Preset.ResetToDefaults="Restableix als valors per defecte"
+Preset.Recording="S'està enregistrant"
+Preset.HighQuality="Alta qualitat"
+Preset.Indistinguishable="Indistinguible"
+Preset.Lossless="Sense pèrdues"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Ús"
+Usage.Description="Quin ús s'ha d'ajustar AMF:\n- '\@Usage.Transcoding\@' és per a un ús generalitzat (recomanat),\n- '\@Usage.UltraLowLatency\@' és per a una codificació de molt baixa latència,\n- '\@Usage.LowLatency\@' és similar a l'anterior però amb una mica més de latència.\nLes emissions en directe només suporten '\@Usage.Transcoding\@', tots els valors poden ser usats per gravar."
+Usage.Transcoding="Transcodificació"
+Usage.UltraLowLatency="Latència ultra baixa"
+Usage.LowLatency="Latència baixa"
+Usage.Webcam="Càmera web"
+QualityPreset="Qualitat del perfil"
+QualityPreset.Description="Quina qualitat del perfil AMF s'ha d'intentar aconseguir:\n- '\@QualityPreset.Speed\@' és el més ràpid però té la pitjor qualitat, \n-'\@QualityPreset.Balanced\@' és una barreja equilibrada de tots dos,\n- '\@QualityPreset.Quality\@' dóna la millor qualitat per a un determinada taxa de bits."
+QualityPreset.Speed="Velocitat"
+QualityPreset.Balanced="Equilibrat"
+QualityPreset.Quality="Qualitat"
+Profile="Perfil"
+Profile.Description="Quin perfil s'ha d'utilitzar per a la codificació, ordenats des de el suport més estès a la més alta qualitat."
+ProfileLevel="Nivell de perfil"
+ProfileLevel.Description="Quin nivell de perfil s'ha d'utilitzar per a la codificació, el millor és deixar-ho això en \@Utility.Automatic\@"
+Tier="Nivell"
+Tier.Description="En quin nivell codificar, Main es la codificació normal, mentre que High s'enfoca en aplicacions d'alta velocitat de bits."
+AspectRatio="Relació d'aspecte"
+AspectRatio.Description="Quina relació d'aspecte s'ha d'escriure a l'arxiu de sortida."
+CodingType="Tipus de codificació"
+CodingType.Description="Quin tipus de codificació utilitzar:\n* '\@Utility.Automatic\@' deixeu que AMF ho decideixi (recomanat).\n* 'CALVC' (Context-Adaptive Variable-Length Coding) és més ràpid, però més gran.\n* 'CABAC' (Context-Adaptive Binary Arithmetic Coding) és més lent, però més petit."
+MaximumReferenceFrames="Fotogrames de referència màxims"
+MaximumReferenceFrames.Description="Quants quadres el codificador pot fer referència al màxim quan codifica, té un impacte directe en la qualitat de codificació."
+RateControlMethod="Mètode de control del flux"
+RateControlMethod.Description="Quin mètode de control de flux s'ha d'utilitzar:\n- '\@RateControlMethod.CQP\@' assigna valors fixos de QP a I-/P-/B-Frames,\n- '\@RateControlMethod.CBR\@' es manté en la taxa de bits objectiu (utilitzant dades de farciment) (recomanat per a emissions en directe),\n- '\@RateControlMethod.VBR\@' es manté per sota d'un pic de taxa de bits,\n- '\@RateControlMethod.VBRLAT\@' es manté prop de la taxa de bits desitjada si la latència i càrrega de la GPU ho permet, si no s'augmentarà la taxa de bits (recomanat per a enregistraments)."
+RateControlMethod.CQP="QP constant (CQP)"
+RateControlMethod.CBR="Flux constant (CBR)"
+RateControlMethod.VBR="Flux variable (pic restringit) (VBR)"
+RateControlMethod.VBRLAT="Flux variable (latència restringida) (VBR_LAT)"
+PrePassMode="Mode passada prèvia"
+PrePassMode.Description="La passada prèvia és una passada de distribució de taxa de bits secundària que permet una millor distribució de la taxa de bits dins d'una seqüència, però els efectes d'aquesta pot variar de targeta a targeta."
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (1/4 de la mida)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (1/2 de la mida)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (mida completa)"
+Bitrate.Target="Tassa de bits desitjada"
+Bitrate.Target.Description="Tassa de bits a intentar arribar a la seqüència general."
+Bitrate.Peak="Pic de tassa de bits"
+Bitrate.Peak.Description="Tassa de bits a intentar aconseguir com a pic màxim en la seqüència general."
+QP.IFrame="I-Frame QP"
+QP.IFrame.Description="Valor fix de QP per I-Frames."
+QP.PFrame="P-Frame QP"
+QP.PFrame.Description="Valor fix de QP per P-Frames."
+QP.BFrame="B-Frame QP"
+QP.BFrame.Description="Valor fix de QP per B-Frames."
+QP.Minimum="QP mínim"
+QP.Minimum.Description="Valor QP mínim a utilitzar en un marc."
+QP.IFrame.Minimum="I-Frame QP mínim"
+QP.IFrame.Minimum.Description="Valor QP mínim a utilitzar en un I-Frame."
+QP.PFrame.Minimum="P-Frame QP mínim"
+QP.PFrame.Minimum.Description="Valor QP mínim a utilitzar en un P-Frame."
+QP.Maximum="QP màxim"
+QP.Maximum.Description="Valor QP màxim a utilitzar en un marc."
+QP.IFrame.Maximum="I-Frame QP màxim"
+QP.IFrame.Maximum.Description="Valor QP màxim a utilitzar en un I-Frame."
+QP.PFrame.Maximum="P-Frame QP màxim"
+QP.PFrame.Maximum.Description="Valor QP màxim a utilitzar en un P-Frame."
+FillerData="Dades de farciment"
+FillerData.Description="En activar les dades de farciment es permet al codificador mantenir almenys \@Bitrate.Target\@ omplint l'espai que falta amb informació sense valor."
+FrameSkipping="Omissió de fotogrames"
+FrameSkipping.Description="L'omissió de fotogrames permet al codificador saltar fotogrames per complir amb el requeriment de la \@Bitrate.Target\@.\nQuan el codificador salta un fotograma inserirà una NAL que repetirà l'últim fotograma codificat a la transmissió.\nPot ajudar amb \@Bitrate.Target\@ molt baixes."
+VBAQ="VBAQ"
+VBAQ.Description="En activar l'ús de 'Quantització adaptativa basada en la variació' (VBAQ) que es basa en la variació del pixel per a una millor distribució de la taxa de bits. \nFunciona amb la idea que el sistema visual humà és menys sensible als artefactes en àrees altament texturades i així mourà la tassa de bits cap a superfícies més suaus. \nEn activar-lo pot portar a millores en la qualitat subjectiva en cert contingut."
+EnforceHRD="Força l'HRD"
+EnforceHRD.Description="Força l'ús d'un descodificador de referència hipotètic que s'utilitza per verificar que el flux de marcs de sortida es correcte."
+VBVBuffer="Memòria intermèdia VBV"
+VBVBuffer.Description="Quin mètode s'ha d'utilitzar per determinar la mida de la memòria intermèdia VBV:\n- '\@Utility.Automatic\@' calcula la mida utilitzant una restricció estricta,\n- '\@Utility.Manual\@' permet a l'usuari controlar la mida.\nLa memòria intermèdia VBV (Verificador de la memòria intermèdia del vídeo) és usat per certs mètodes de control del flux per mantenir la taxa de bits dins dels paràmetres establerts."
+VBVBuffer.Strictness="Rigorositat de la memòria intermèdia VBV"
+VBVBuffer.Strictness.Description="Determina la rigidesa de la memòria intermèdia VBV, con 100% essent tant estricte com sigui possible i 0% sense restricció."
+VBVBuffer.Size="Mida de la memòria intermèdia VBV"
+VBVBuffer.Size.Description="La mida de la memòria intermèdia VBV que s'utilitza per al control de la tassa de marcs en una seqüencia."
+VBVBuffer.InitialFullness="Amplitud inicial de la memòria intermèdia VBV"
+VBVBuffer.InitialFullness.Description="Com de ple és la memòria intermèdia VMV inicialment (en %), només afectarà la seqüència inicial de la codificació."
+KeyframeInterval="Interval de fotogrames clau"
+KeyframeInterval.Description="Interval (en segons) entre fotogrames clau."
+H264.IDRPeriod="Període IDR (en marcs)"
+H264.IDRPeriod.Description="Defineix la distància entre Instantaneous Decoding Refreshes (IDR) en fotogrames. També controla la mida de la seqüència del GOP."
+H265.IDRPeriod="Període IDR (en GOPs)"
+H265.IDRPeriod.Description="Defineix la distància entre actualitzacions de descodificació instantània (IDR) en GOPs."
+GOP.Type="Tipus de GOP"
+GOP.Type.Description="Quin tipus de GOP s'ha d'utilitzar:\n- '\@GOP.Type.Fixed\@' utilitzarà sempre distàncies fixes entre cada GOP.\n- '\@GOP.Type.Variable\@' permet GOPs de diferents mides, depenent del que es necessiti.\n'\@GOP.Type.Fixed\@' és com funciona la implementació H264 i el millor per a les transmissions de xarxa local, mentre que '\@GOP.Type.Variable\@' és el millor per a enregistraments d'alta qualitat i de baix grandària."
+GOP.Type.Fixed="Fix"
+GOP.Type.Variable="Variable"
+GOP.Size="Mida del GOP"
+GOP.Size.Description="Mida del GOP (grup d'imatges) en marcs."
+GOP.Size.Minimum="Mida mínima del GOP"
+GOP.Size.Minimum.Description="Mida mínima del GOP (grup d'imatges) en marcs."
+GOP.Size.Maximum="Mida màxima del GOP"
+GOP.Size.Maximum.Description="Mida màxima del GOP (grup d'imatges) en marcs."
+GOP.Alignment="Alineació del GOP"
+GOP.Alignment.Description="Experimental, els efectes són desconeguts. Utilitzeu-lo sota la vostra responsabilitat."
+BFrame.Pattern="Patró B-Frame"
+BFrame.Pattern.Description="La quantitat de B-Frames a utilitzar mentre es codifica.\nCompatible amb targetes de 2ª i 3ª generació VCE. Impacte negatiu en el rendiment de codificació."
+BFrame.DeltaQP="B-Frame Delta QP"
+BFrame.DeltaQP.Description="Valor Delta QP per al darrer I- o P-Frame per B-Frames no referenciables."
+BFrame.Reference="Referència B-Frame"
+BFrame.Reference.Description="Permet a un B-Frames utilitzar també B-Frames com referència, enlloc de P- i I-Frames."
+BFrame.ReferenceDeltaQP="Referència B-Frame Delta QP"
+BFrame.ReferenceDeltaQP.Description="Valor Delta QP per al darrer I- o P-Frame per B-Frames referenciables."
+DeblockingFilter="Filtre d'eliminació de blocs"
+DeblockingFilter.Description="Permet al descodificador aplicar un filtre d'eliminació de blocs."
+MotionEstimation="Estimació del moviment"
+MotionEstimation.Description="L'estimació del moviment permet al codificador reduir el flux de dades necessari calculant d'on vénen els píxels."
+MotionEstimation.Quarter="1/4 de píxel"
+MotionEstimation.Half="1/2 de píxel"
+MotionEstimation.Full="1/4 i 1/2 de píxel"
+Video.API="API de vídeo"
+Video.API.Description="Quina API hauria d'utilitzar el rerefons?"
+Video.Adapter="Adaptador de vídeo"
+Video.Adapter.Description="En quin adaptador hauríem d'intentar codificar?"
+OpenCL="OpenCL"
+OpenCL.Description="S'hauria de fer servir l'OpenCL per enviar marcs? Tècnicament és més ràpid, però causa problemes amb els controladors d'Intel (degut a biblioteques OpenCL incompatibles)."
+View="Mode de visualització"
+View.Description="Quines propietats s'han de mostrar?\nEn utilitzar '\@View.Master\@' us desqualifica per rebre suport."
+View.Basic="Bàsic"
+View.Advanced="Avançat"
+View.Expert="Expert"
+View.Master="Màster"
+Debug="Depuració"
+Debug.Description="Activa el missatges de depuració addicionals. Requereix que executeu l'Open Broadcaster Software Studio amb la línia d'ordres '--verbose --log_unfiltered' (elimineu les ')."
AMF.H264.MaximumLTRFrames="Fotogrames màxims LTR"
AMF.H264.MaximumLTRFrames.Description="Fotogrames de referència a llarg plaç (LTR) és una característica que permet al codificador marcar certs marcs en una seqüencia com referents per un llarg temps.\nEls fotogrames LTR no poden ser utilitzats amb B-Pictures i el codificador desactivarà B-Pictures si s'utilitza."
AMF.H264.MaximumAccessUnitSize="Mida màxima de la unitat d'accés"
AMF.H264.HeaderInsertionSpacing.Description="Quants fotogrames han d'haver entre capçaleres NAL."
AMF.H264.WaitForTask="Espera per la tasca"
AMF.H264.WaitForTask.Description="Desconegut, Experimental"
-AMF.H264.PreAnalysisPass="Passada d'anàlisi previ"
-AMF.H264.PreAnalysisPass.Description="Desconegut, Experimental"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Desconegut, Experimental"
-AMF.H264.GOPSize="Mida del GOP"
-AMF.H264.GOPSize.Description="Desconegut, Experimental"
-AMF.H264.GOPAlignment="Alineació del GOP"
-AMF.H264.GOPAlignment.Description="Desconegut, Experimental"
-AMF.H264.MaximumReferenceFrames="Fotogrames de referència màxims"
-AMF.H264.MaximumReferenceFrames.Description="Desconegut, Experimental"
AMF.H264.SlicesPerFrame="Porcions per fotograma"
AMF.H264.SlicesPerFrame.Description="Quantes porcions I-Frame han de ser emmagatzemats en cada fotograma?\nUn valor de 0 permet al codificador decidir sobre la marxa.\nLa codificació Intra-Refresh és usada per a una reproducció i exploració més fluïda."
AMF.H264.SliceMode="Mode de porcions"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Desconegut, Experimental"
AMF.H264.IntraRefresh.MacroblocksPerSlot="Macroblocs intra-refresh per Slot"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Quants macroblocs han de ser emmagatzemats en cada slot?\nUn valor de 0 desactiva aquesta funció.\nLa codificació Intra-Refresh és usada per a una reproducció i exploració més fluïda."
-AMF.H264.VideoAPI="API de vídeo"
-AMF.H264.VideoAPI.Description="Quina API utilitzar per la codificació."
-AMF.H264.VideoAdapter="Adaptador de vídeo"
-AMF.H264.VideoAdapter.Description="Quin adaptador utilitzar per la codificació."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="El codificador hauria d'utilitzar OpenCL per presentar els fotogrames individuals?"
-AMF.H264.View="Mode de visualització"
-AMF.H264.View.Description="Quines propietats han de ser visibles. No rebreu suport si feu servir el mode de vista 'Expert' o 'Màster'."
-AMF.H264.View.Basic="Bàsic"
-AMF.H264.View.Advanced="Avançat"
-AMF.H264.View.Expert="Expert"
-AMF.H264.View.Master="Màster"
-AMF.H264.Debug="Depuració"
-AMF.H264.Debug.Description="Habilita el registre d'informació de depuració addicional, ha de ser activat quan necessiteu ajuda amb aquest codificador."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/cs-CZ.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/cs-CZ.ini
Changed
-AMF.Util.Default="Výchozí"
-AMF.Util.Automatic="Automatické"
-AMF.Util.Manual="Manuální"
-AMF.Util.Toggle.Disabled="Zakázáno"
-AMF.Util.Toggle.Enabled="Povoleno"
-AMF.H264.Preset="Profil"
-AMF.H264.Preset.ResetToDefaults="Obnovit výchozí"
-AMF.H264.Preset.Recording="Nahrávání"
-AMF.H264.Preset.HighQuality="Vysoká kvalita"
-AMF.H264.Preset.Indistinguishable="Nerozeznatelné"
-AMF.H264.Preset.Lossless="Lossless"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Využití"
-AMF.H264.Usage.Transcoding="Kódování"
-AMF.H264.Usage.UltraLowLatency="Ultra nízká odezva"
-AMF.H264.Usage.LowLatency="Nízká odezva"
-AMF.H264.QualityPreset="Profil kvality"
-AMF.H264.QualityPreset.Speed="Rychlost"
-AMF.H264.QualityPreset.Balanced="Vyváženo"
-AMF.H264.QualityPreset.Quality="Kvalita"
-AMF.H264.Profile="Profil"
-AMF.H264.ProfileLevel="Úroveň profilu"
-AMF.H264.RateControlMethod="Metoda řízení"
-AMF.H264.RateControlMethod.CQP="Konstantní QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Konstantní bitrate (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Proměnný bitrate (vázán špičkou) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Proměnný bitrate (vázán odezvou) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Cílový bitrate"
-AMF.H264.Bitrate.Target.Description="Bitrate, kterého se máme snažit dosáhnout v celé sekvenci."
-AMF.H264.Bitrate.Peak="Špičkový bitrate"
-AMF.H264.Bitrate.Peak.Description="Bitrate, kterého se máme snažit nepřekročit v celé sekvenci."
-AMF.H264.QP.Minimum="Minimální QP"
-AMF.H264.QP.Maximum="Maximální QP"
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.IFrame.Description="Pevná hodnota QP používaná pro I-Frames."
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.PFrame.Description="Pevná hodnota QP používaná pro P-Frames."
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.QP.BFrame.Description="Pevná hodnota QP používaná pro B-Frames."
-AMF.H264.VBVBuffer="VBV Buffer"
-AMF.H264.FillerData="Filtrovat data"
-AMF.H264.FrameSkipping="Přeskakování snímků"
-AMF.H264.EnforceHRDCompatibility="Vynutit kompatibilitu s HRD"
-AMF.H264.KeyframeInterval="Interval klíčový snímků"
-AMF.H264.KeyframeInterval.Description="Kolik vteřin by mělo být mezi ne-zahazovatelnými snímky.\nTaké ovládá velikost sekvence(GOP)."
-AMF.H264.ScanType="Typ skenování"
-AMF.H264.ScanType.Description="Určuje použitou metodu skenování, vždy ponechejte na 'Progresivní'."
-AMF.H264.ScanType.Progressive="Progresivní"
-AMF.H264.ScanType.Interlaced="Prokládané"
-AMF.H264.MotionEstimation="Odhad pohybu"
-AMF.H264.MotionEstimation.Description="Odhad pohybu umožňuje enkodéru snížit požadovaný bitrate předpovídáním, kam se určitý pixel posunul."
-AMF.H264.MotionEstimation.None="Žádný"
-AMF.H264.MotionEstimation.Half="Polovina pixelu"
-AMF.H264.MotionEstimation.Quarter="Čtvrtina pixelu"
-AMF.H264.MotionEstimation.Both="Polovina & čtvrtina pixelu"
+Utility.Default="Výchozí"
+Utility.Automatic="Automatické"
+Utility.Manual="Manuální"
+Utility.Switch.Disabled="Zakázáno"
+Utility.Switch.Enabled="Povoleno"
+Preset="Profil"
+Preset.ResetToDefaults="Obnovit výchozí"
+Preset.Recording="Nahrávání"
+Preset.HighQuality="Vysoká kvalita"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Využití"
+Usage.UltraLowLatency="Ultra nízká odezva"
+Usage.Webcam="Webkamera"
+QualityPreset.Balanced="Vyvážený"
+QualityPreset.Quality="Kvalita"
+Profile="Profil"
+ProfileLevel="Úroveň profilu"
AMF.H264.MaximumLTRFrames="Maximální počet LTR snímků"
-AMF.H264.VideoAPI="Grafické rozhraní (API)"
-AMF.H264.VideoAPI.Description="Které rozhraní má být použito pro kódování."
-AMF.H264.VideoAdapter="Grafický adaptér"
-AMF.H264.VideoAdapter.Description="Adaptér, který má být použit pro kódování."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Má být použit OpenCL pro vkládání jednotlivých snímků?"
-AMF.H264.View="Režim zobrazení"
-AMF.H264.View.Description="Které možnosti mají být zobrazeny. Nezískáte žádnou pomoc při použití módu 'Expert' či 'Master'."
-AMF.H264.View.Basic="Základní"
-AMF.H264.View.Advanced="Pokročilý"
-AMF.H264.View.Expert="Expert"
-AMF.H264.View.Master="Master"
-AMF.H264.Debug="Ladění"
-AMF.H264.Debug.Description="Zapne rozšířené protokolování, mělo by být zapnuto, pokud pořebujete pomoci s tmto enkodérem."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/da-DK.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/da-DK.ini
Changed
-AMF.Util.Default="Standard"
-AMF.Util.Automatic="Automatisk"
-AMF.Util.Manual="Manuelt"
-AMF.Util.Toggle.Disabled="Deaktiveret"
-AMF.Util.Toggle.Enabled="Aktiveret"
-AMF.H264.Preset="Forudindstillinger"
-AMF.H264.Preset.ResetToDefaults="Nulstil til standarder"
-AMF.H264.Preset.Recording="Optagelse"
-AMF.H264.Preset.HighQuality="Højkvalitet"
-AMF.H264.Preset.Indistinguishable="Indistingverbar"
-AMF.H264.Preset.Lossless="Tabsfri"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Anvendelse"
-AMF.H264.Usage.Description="Hvilken anvendelse AMF bør indstilles til:\n- '\@AMF.H264.Usage.Transcoding\@' er omkodning til generelle formål (anbefalet),\n - '\@AMF.H264.Usage.UltraLowLatency\@' er til virkelig lav-forsinkelseskodning,\n- '\@AMF.H264.Usage.LowLatency\@' er tilsvarende ovennævnte med en lidt større forsinkelse.\nStreaming understøtter kun '\@AMF.H264.Usage.Transcoding\@', alle andre værdier kan benyttes til optagelse."
-AMF.H264.Usage.Transcoding="Transcoding"
-AMF.H264.Usage.UltraLowLatency="Ekstrem Lav Ventetid"
-AMF.H264.Usage.LowLatency="Lav Ventetid"
-AMF.H264.QualityPreset="Kvalitets profiler"
-AMF.H264.QualityPreset.Description="Hvilken Kvalitetsforudindstilling AMF bør forsøges målrettet imod:\n- '\@AMF.H264.QualityPreset.Speed\@' er den hurtigste, men har den ringeste kvalitet,\n- '\@AMF.H264.QualityPreset.Balanced\@' er en balanceret blanding af begge,\n- '\@AMF.H264.QualityPreset.Quality\@' giver den bedste kvalitet for en given bithastighed."
-AMF.H264.QualityPreset.Speed="Hastighed (Dårligste kvalitet)"
-AMF.H264.QualityPreset.Balanced="Balanceret (Mellem Kvalitet)"
-AMF.H264.QualityPreset.Quality="Kvalitet (Bedste Kvalitet)"
-AMF.H264.Profile="Profil"
-AMF.H264.Profile.Description="Hvilken H.264-profil at benytte til kodning, sorteret fra højeste kvalitet til mest udbredte understøttelse."
-AMF.H264.ProfileLevel="Profil Niveau"
-AMF.H264.ProfileLevel.Description="Hvilket H.264-profilniveau at benytte til kodning:\n- '\@AMF.Util.Automatic\@' beregner det bedste profilniveau for en given billedhastighed og -størrelse,\n- '4.1' understøtter 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' understøtter 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' understøtter 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' understøtter 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' understøtter 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Rate Control Method"
-AMF.H264.RateControlMethod.CQP="Constant QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Constant Bitrate (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Variable Bitrate (Peak Constrained) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Variable Bitrate (Latency Constrained) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Mål Bitrate"
-AMF.H264.Bitrate.Peak="Top Bitrate"
-AMF.H264.QP.Minimum="Minimum QP"
-AMF.H264.QP.Maximum="Maksimum QP"
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.FillerData="Filler Data"
-AMF.H264.FrameSkipping="Billede skip"
-AMF.H264.EnforceHRDCompatibility="Tving HRD Kompatibilitet"
-AMF.H264.DeblockingFilter="Deblocking Filter"
-AMF.H264.ScanType="Scan Type"
-AMF.H264.ScanType.Progressive="Progressive"
-AMF.H264.ScanType.Interlaced="Interlaced"
+Utility.Default="Standard"
+Utility.Automatic="Automatisk"
+Utility.Manual="Manuel"
+Utility.Switch.Disabled="Deaktiveret"
+Utility.Switch.Enabled="Aktiveret"
+Preset="Forudindstillinger"
+Preset.ResetToDefaults="Nulstil til standarder"
+Preset.Recording="Optagelse"
+Preset.HighQuality="Høj kvalitet"
+Preset.Indistinguishable="Ingen forskel"
+Preset.Lossless="Tabsfri"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Brug"
+Usage.Transcoding="Transcoding"
+Usage.UltraLowLatency="Ultra lav latenstid"
+Usage.LowLatency="Lav latenstid"
+Usage.Webcam="Webcam"
+QualityPreset="Kvalitets profiler"
+QualityPreset.Speed="Hastighed"
+QualityPreset.Balanced="Balanceret"
+QualityPreset.Quality="Kvalitet"
+Profile="Profil"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/de-DE.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/de-DE.ini
Changed
-AMF.Util.Default="Standard"
-AMF.Util.Automatic="Automatisch"
-AMF.Util.Manual="Manuell"
-AMF.Util.Toggle.Disabled="Deaktiviert"
-AMF.Util.Toggle.Enabled="Aktiviert"
-AMF.H264.Preset="Voreinstellungen"
-AMF.H264.Preset.ResetToDefaults="Standardeinstellungen wiederherstellen"
-AMF.H264.Preset.Recording="Aufnahme"
-AMF.H264.Preset.HighQuality="Hohe Qualität"
-AMF.H264.Preset.Indistinguishable="Ununterscheidbar"
-AMF.H264.Preset.Lossless="Verlustfrei"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Nutzungsart"
-AMF.H264.Usage.Description="Auf welche Nutzungsart AMF eingestellt werden soll:\n- '\@AMF.H264.Usage.Transcoding\@' für generelle Codierung (empfohlen),\n- '\@AMF.H264.Usage.UltraLowLatency\@' für Codierung mit sehr niedriger Latenz,\n- '\@AMF.H264.Usage.LowLatency\@' ist ähnlich dem oberen mit leicht erhöhter Latenz.\nNur '\@AMF.H264.Usage.Transcoding\@' wird für das Streamen unterstützt, alle anderen können für Aufnahmen verwendet werden."
-AMF.H264.Usage.Transcoding="Umwandeln"
-AMF.H264.Usage.UltraLowLatency="Sehr Niedrige Latenz"
-AMF.H264.Usage.LowLatency="Niedrige Latenz"
-AMF.H264.QualityPreset="Qualitätsvoreinstellung"
-AMF.H264.QualityPreset.Description="Welche Qualitätsvoreinstellungen AMF versuchen soll zu verwenden:\n- '\@AMF.H264.QualityPreset.Speed\@' ist die schnellste aber hat die schlechteste Qualität,\n- '\@AMF.H264.QualityPreset.Balanced\@' ist ein balancierter Mix aus beiden,\n- '\@AMF.H264.QualityPreset.Quality\@' gibt die beste Qualität für eine gegebene Bitrate."
-AMF.H264.QualityPreset.Speed="Geschwindigkeit"
-AMF.H264.QualityPreset.Balanced="Ausgeglichen"
-AMF.H264.QualityPreset.Quality="Qualität"
-AMF.H264.Profile="Profil"
-AMF.H264.Profile.Description="Welches H.264 Profil soll für die Kodierung verwendet werden, sortiert von höchster Qualität bis am weitesten verbeiteten Unterstützung."
-AMF.H264.ProfileLevel="Profillevel"
-AMF.H264.ProfileLevel.Description="Welches H.264 Profil Level für das verarbeiten verwendet werden soll:\n- '\@AMF.Util.Automatic\@' errechnet das beste Profil Level für die gegebene Frame Rate und Frame Größe,\n- '4.1' unterstützt 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' unterstützt 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' unterstützt 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' unterstützt 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' unterstützt 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Ratenkontrollmethode"
-AMF.H264.RateControlMethod.Description="Welche Ratenkontrollmethode verwendet werden soll:\n- '\@AMF.H264.RateControlMethod.CQP\@' setzt feste QP (Quantifizierungsparameter) Werte für I-/P-/B- Bilder,\n- '\@AMF.H264.RateControlMethod.CBR\@' bleibt auf der Zielbitrate (mithilfe von Füllungsdaten) (empfohlen für das Streamen),\n- '\@AMF.H264.RateControlMethod.VBR\@' bleibt unter der Spitzenbitrate,\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@' bleibt nahe der Zielbitrate sofern GPU-Latenz und GPU-Benutzung dies erlauben, ansonsten wird eine höhere Bitrate verwendet (empfohlen für Aufnahmen)."
-AMF.H264.RateControlMethod.CQP="Konstante QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Konstante Bitrate (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Variable Bitrate (Eingeschränkt via Spitzenbitrate) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Variable Bitrate (Eingeschränkt via Latenz) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Zielbitrate"
-AMF.H264.Bitrate.Target.Description="Zielbitrate die in der ganzen Sequenz erreicht werden soll."
-AMF.H264.Bitrate.Peak="Spitzenbitrate"
-AMF.H264.Bitrate.Peak.Description="Spitzenbitrate die in der ganzen Sequenz erreicht werden darf."
-AMF.H264.QP.Minimum="Minimale QP"
-AMF.H264.QP.Minimum.Description="Niedrigster QP (Quantifizierungsparameter) wert der in einem Bild verwendet wird."
-AMF.H264.QP.Maximum="Maximale QP"
-AMF.H264.QP.Maximum.Description="Höchster QP (Quantifizierungsparameter) wert der in einem Bild verwendet wird."
-AMF.H264.QP.IFrame="I-Einzelbild QP"
-AMF.H264.QP.IFrame.Description="Fester QP wert der für I-Bilder verwendet wird."
-AMF.H264.QP.PFrame="P-Einzelbild QP"
-AMF.H264.QP.PFrame.Description="Fester QP wert der für P-Bilder verwendet wird."
-AMF.H264.QP.BFrame="B-Einzelbild QP"
-AMF.H264.QP.BFrame.Description="Fester QP wert der für B-Bilder verwendet wird."
-AMF.H264.VBVBuffer="VBV Buffer"
-AMF.H264.VBVBuffer.Description="Welche Methode genutzt werden soll um die VBV-Puffergröße zu bestimmen:\n- '\@AMF.Util.Automatic\@' errechnet diese mithilfe einer Strengeneinschränkung,\n- '\@AMF.Util.Manual\@' erlaubt es dem Nutzer die Größe zu kontrollieren.\nVBV (Videopufferungsverifizierer) Puffer wird von verschiedenen Ratenkontrollmethoden genutzt um die gesamte Bitrate innerhalb der Begrenzungen zu halten."
-AMF.H264.VBVBuffer.Strictness="VBV Buffer Genauigkeit"
-AMF.H264.VBVBuffer.Strictness.Description="Legt die Einschränkungsstärke des VBV Buffers fest, wobei 100% so eingeschränkt wie möglich ist und 0% komplett uneingeschränkt."
-AMF.H264.VBVBuffer.Size="VBV Buffer Größe"
-AMF.H264.VBVBuffer.Size.Description="Die Größe des VBV Buffers, welcher für die Bitratenkontrolle in einer Sequenz verwendet wird."
-AMF.H264.VBVBuffer.Fullness="VBV Buffer Füllung"
-AMF.H264.VBVBuffer.Fullness.Description="Wie voll der VBV Buffer anfangs ist, hat nur einen Effekt auf die erste Sequenz beim codieren."
-AMF.H264.FillerData="Füllungsdaten"
-AMF.H264.FillerData.Description="Das aktivieren von Füllungsdaten erlaubt es dem Codierer mindestens die Zielbitrate zu erreichen in dem dieser den verbleibenden Platz mit leeren Informationen füllt."
-AMF.H264.FrameSkipping="Bildüberspringung"
-AMF.H264.FrameSkipping.Description="Bildüberspringung erlaubt es dem Codierer, Bilder zu überspringen um die Zielbitrate einzuhalten.\nWenn der Codierer ein Bild überspringt, fügt dieser stattdessen ein repeat-last-frame NAL in den Stream ein.\nKann bei sehr niedrigen Zielbitraten helfen."
-AMF.H264.EnforceHRDCompatibility="Erzwinge HRD Kompatiblität"
-AMF.H264.EnforceHRDCompatibility.Description="Erzwinge Hypothetischer-Referenz-Decodierer Limitierungen welches die maximale Änderung des QP werts innerhalb eines Bildes begrenzt."
-AMF.H264.KeyframeInterval="Keyframeintervall"
-AMF.H264.KeyframeInterval.Description="Definiert die Distanz zwischen Keyframes in Sekunden. Setzt auch die Größe einer GOP-Sequenz fest."
-AMF.H264.IDRPeriod="IDR Zeitraum"
-AMF.H264.IDRPeriod.Description="Definiert die Distanz zwischen Sofortigen-Decodierer-Aktualisierungen (IDR) in Frames. Setzt auch die Größe einer GOP-Sequenz fest."
-AMF.H264.BFrame.Pattern="B-Bilder"
-AMF.H264.BFrame.Pattern.Description="Wie viele B-Bilder beim codieren verwendet werden sollen.\nWird von der 2ten und 3ten generation an VCE-Karten unterstützt. Negativer Einfluss auf Codierungsperformanz."
-AMF.H264.BFrame.DeltaQP="Delta QP für B-Bilder"
-AMF.H264.BFrame.DeltaQP.Description="Delta QP wert zum letzten I- or P-Bild für nicht referenzierbare B-Bilder."
-AMF.H264.BFrame.Reference="Referenzierbare B-Bilder"
-AMF.H264.BFrame.Reference.Description="Erlaube einem B-Bild andere B-Bilder als referenz zu verwenden, anstatt nur P- und I-Bilder."
-AMF.H264.BFrame.ReferenceDeltaQP="Delta QP für referenzierbare B-Bilder"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="Delta QP wert zum letzten I- or P-Bild für referenzierbare B-Bilder."
-AMF.H264.DeblockingFilter="Entblockungsfilter"
-AMF.H264.DeblockingFilter.Description="Setze die Markierung dass der Decodierer einen Entblockungsfilter verwenden darf."
-AMF.H264.ScanType="Abtastverfahren"
-AMF.H264.ScanType.Description="Welches Abtastverfahren verwendet werden soll. Sollte immer '\@AMF.H264.ScanType.Progressive\@' sein."
-AMF.H264.ScanType.Progressive="Progressiv"
-AMF.H264.ScanType.Interlaced="Zeilensprung"
-AMF.H264.MotionEstimation="Bewegungsschätzung"
-AMF.H264.MotionEstimation.Description="Bewegungsschätzung erlaubt des dem Codierer die benötigte Bitrate zu reduzieren durch das herausfinden, wo ein Pixel hinbewegt wurde."
-AMF.H264.MotionEstimation.None="Keine"
-AMF.H264.MotionEstimation.Half="Halb-Pixel"
-AMF.H264.MotionEstimation.Quarter="Viertel-Pixel"
-AMF.H264.MotionEstimation.Both="Halb- & Viertel-Pixel"
-AMF.H264.CodingType="Codierungstyp"
-AMF.H264.CodingType.Description="Welcher Codierungstyp verwendet werden soll:\n* \@AMF.Util.Default\@ lässt AMF entscheiden (empfohlen).\n* CALVC (Context-Adaptive Variable-Length Coding) ist schneller aber größer.\n* CABAC (Context-Adaptive Binary Arithmetic Coding) ist langsamer aber kleiner."
+Utility.Default="Standard"
+Utility.Automatic="Automatisch"
+Utility.Manual="Manuell"
+Utility.Switch.Disabled="Deaktiviert"
+Utility.Switch.Enabled="Aktiviert"
+Preset="Voreinstellung"
+Preset.ResetToDefaults="Standardeinstellungen wiederherstellen"
+Preset.Recording="Aufnahme"
+Preset.HighQuality="Hohe Qualität"
+Preset.Indistinguishable="Ununterscheidbar"
+Preset.Lossless="Verlustfrei"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Nutzungsart"
+Usage.Description="Auf welche Nutzungsart AMF eingestellt werden soll:\n- '\@Usage.Transcoding\@' für generelle Kodierung (empfohlen),\n- '\@Usage.UltraLowLatency\@' für Kodierung mit sehr niedriger Latenz,\n- '\@Usage.LowLatency\@' ist ähnlich dem oberen mit leicht erhöhter Latenz.\nNur '\@Usage.Transcoding\@' wird für das Streamen unterstützt, alle anderen können für Aufnahmen verwendet werden."
+Usage.Transcoding="Umwandeln"
+Usage.UltraLowLatency="Sehr Niedrige Latenz"
+Usage.LowLatency="Niedrige Latenz"
+Usage.Webcam="Webcam"
+QualityPreset="Qualitätsvoreinstellung"
+QualityPreset.Description="Welche Qualitätsvoreinstellungen AMF versuchen soll zu verwenden:\n- '\@QualityPreset.Speed\@' ist die schnellste aber hat die schlechteste Qualität,\n- '\@QualityPreset.Balanced\@' ist ein balancierter Mix aus beiden,\n- '\@QualityPreset.Quality\@' gibt die beste Qualität für eine gegebene Bitrate."
+QualityPreset.Speed="Geschwindigkeit"
+QualityPreset.Balanced="Ausgeglichen"
+QualityPreset.Quality="Qualität"
+Profile="Profil"
+Profile.Description="Welches Profil für das Kodieren verwendet wird. Sortiert von am besten unterstützt (oben) bis zu bester Qualität (unten)."
+ProfileLevel="Profillevel"
+ProfileLevel.Description="Welches Profillevel für das Kodieren verwendet wird. Es ist am besten, dies auf \@Utility.Automatic\@ zu belassen."
+Tier="Stufe"
+Tier.Description="Welche Stufe beim Kodieren verwendet wird. 'High' ist für hohe Bitraten/Bandbreiten gedacht, während 'Main' eher für Mainstream-Medien gedacht ist."
+AspectRatio="Seitenverhältnis"
+AspectRatio.Description="Welches Seitenverhältnis in die kodierte Ausgabe geschrieben werden soll."
+CodingType="Kodierungstyp"
+CodingType.Description="Welcher Kodierungstyp verwendet werden soll:\n* \@Utility.Automatic\@ lässt AMF entscheiden (empfohlen).\n* 'CALVC' (Context-Adaptive Variable-Length Coding) ist schneller aber größer.\n* 'CABAC' (Context-Adaptive Binary Arithmetic Coding) ist langsamer aber kleiner."
+MaximumReferenceFrames="Maximale Referenzbilder"
+MaximumReferenceFrames.Description="Wie viele Bilder der Kodierer beim Kodieren maximal referenzieren darf. Hat einen direkten Einfluss auf die Kodierungsqualität."
+RateControlMethod="Ratenkontrollmethode"
+RateControlMethod.Description="Welche Ratenkontrollmethode verwendet werden soll:\n- '\@RateControlMethod.CQP\@' setzt feste QP-Werte für I-/P-/B-Bilder,\n- '\@RateControlMethod.CBR\@' bleibt auf der Zielbitrate (mithilfe von Füllungsdaten) (empfohlen für das Streamen),\n- '\@RateControlMethod.VBR\@' bleibt unter der Spitzenbitrate,\n- '\@RateControlMethod.VBRLAT\@' bleibt nahe der Zielbitrate, sofern GPU-Latenz und GPU-Auslastung dies erlauben, ansonsten wird eine höhere Bitrate verwendet (empfohlen für Aufnahmen)."
+RateControlMethod.CQP="Konstanter QP (CQP)"
+RateControlMethod.CBR="Konstante Bitrate (CBR)"
+RateControlMethod.VBR="Variable Bitrate (Eingeschränkt via Spitzenbitrate) (VBR)"
+RateControlMethod.VBRLAT="Variable Bitrate (Eingeschränkt via Latenz) (VBRLAT)"
+PrePassMode="Vordurchlaufsmodus"
+PrePassMode.Description="Der Vordurchlauf ist ein zweiter Bitratenverteilungsdurchlauf, welcher eine bessere Verteilung der Bitrate innerhalb einer Sequenz erlaubt; die Effekte können jedoch von Karte zu Karte unterschiedlich sein."
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (Viertel der Größe)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (Hälfte der Größe)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (Volle Größe)"
+Bitrate.Target="Zielbitrate"
+Bitrate.Target.Description="Zielbitrate die in der ganzen Sequenz erreicht werden soll."
+Bitrate.Peak="Spitzenbitrate"
+Bitrate.Peak.Description="Spitzenbitrate die in der ganzen Sequenz erreicht werden darf."
+QP.IFrame="I-Bild QP"
+QP.IFrame.Description="Fester QP-Wert, der für I-Bilder verwendet wird."
+QP.PFrame="P-Bild QP"
+QP.PFrame.Description="Fester QP-Wert, der für P-Bilder verwendet wird."
+QP.BFrame="B-Bild QP"
+QP.BFrame.Description="Fester QP-Wert, der für B-Bilder verwendet wird."
+QP.Minimum="Minimaler QP"
+QP.Minimum.Description="Niedrigster QP-Wert, der in einem Bild verwendet wird."
+QP.IFrame.Minimum="Minimaler I-Bild QP"
+QP.IFrame.Minimum.Description="Niedrigster QP-Wert, der in einem I-Bild verwendet wird."
+QP.PFrame.Minimum="Minimaler P-Bild QP"
+QP.PFrame.Minimum.Description="Niedrigster QP-Wert, der in einem P-Bild verwendet wird."
+QP.Maximum="Maximaler QP"
+QP.Maximum.Description="Höchster QP-Wert, der in einem Bild verwendet wird."
+QP.IFrame.Maximum="Maximaler I-Bild QP"
+QP.IFrame.Maximum.Description="Höchster QP-Wert, der in einem I-Bild verwendet wird."
+QP.PFrame.Maximum="Maximaler P-Bild QP"
+QP.PFrame.Maximum.Description="Höchster QP-Wert, der in einem P-Bild verwendet wird."
+FillerData="Füllungsdaten"
+FillerData.Description="Das Aktivieren von Füllungsdaten erlaubt es dem Kodierer, mindestens die \@Bitrate.Target\@ zu erreichen, indem dieser den verbleibenden Platz mit leeren Informationen füllt."
+FrameSkipping="Bildüberspringung"
+FrameSkipping.Description="Bildüberspringung erlaubt es dem Kodierer, Bilder zu überspringen, um die \@Bitrate.Target\@ einzuhalten.\nWenn der Kodierer ein Bild überspringt, fügt dieser stattdessen ein repeat-last-frame-NAL in den Stream ein.\nKann bei sehr niedriger \@Bitrate.Target\@ helfen."
+VBAQ="VBAQ"
+VBAQ.Description="Aktiviert die Verwendung der \"auf Varianz basierten adaptiven Quantisierung\" (VBAQ), welche auf Pixelvarianz basiert, um die Bitrate besser zu verteilen.\nSie basiert auf der Idee, dass das menschliche Auge weniger anfällig für Artefakte in stark strukturierten Flächen ist, und erhöht so die Bitrate bei glatten Oberflächen.\nAktivierung kann zu Verbesserungen der subjektiven Qualität bei bestimmten Inhalten führen."
+EnforceHRD="Erzwinge HRD"
+EnforceHRD.Description="Erzwinge die Nutzung eines hypothetischen Referenzdekodierers (HRD), welcher genutzt wird, um den ausgehenden Datenstrom zu verifizieren."
+VBVBuffer="VBV Buffer"
+VBVBuffer.Description="Welche Methode genutzt werden soll, um die VBV-Puffergröße zu bestimmen:\n- '\@Utility.Automatic\@' errechnet diese mithilfe einer strengen Einschränkung,\n- '\@Utility.Manual\@' erlaubt es dem Nutzer, die Größe zu kontrollieren.\nDer VBV-Puffer (Videopufferungsverifizierer) wird von verschiedenen Ratenkontrollmethoden genutzt, um die gesamte Bitrate innerhalb der Begrenzungen zu halten."
+VBVBuffer.Strictness="VBV Buffer Strenge"
+VBVBuffer.Strictness.Description="Legt die Einschränkungsstärke des VBV Buffers fest, wobei 100% so eingeschränkt wie möglich ist und 0% komplett uneingeschränkt."
+VBVBuffer.Size="VBV Buffer Größe"
+VBVBuffer.Size.Description="Die Größe des VBV Buffers, welcher für die Bitratenkontrolle in einer Sequenz verwendet wird."
+VBVBuffer.InitialFullness="VBV Buffer Anfängliche Füllung"
+VBVBuffer.InitialFullness.Description="Wie voll der VBV Buffer am Anfang ist (in %). Hat nur einen Effekt auf die erste Sequenz beim Kodieren."
+KeyframeInterval="Schlüsselbildintervall"
+KeyframeInterval.Description="Intervall (in Sekunden) zwischen Schlüsselbildern."
+H264.IDRPeriod="IDR Intervall (in Bildern)"
+H264.IDRPeriod.Description="Definiert die Distanz zwischen Sofortigen-Dekodierer-Aktualisierungen (IDR) in Frames. Setzt auch die Größe einer GOP-Sequenz fest."
+H265.IDRPeriod="IDR Intervall (in GOPs)"
+H265.IDRPeriod.Description="Definiert die Distanz zwischen Sofortigen-Dekodierer-Aktualisierungen (IDR) in GOPs."
+GOP.Type="GOP Typ"
+GOP.Type.Description="Welcher Typ an GOP verwendet werden soll:\n- '\@GOP.Type.Fixed\@' wird immer eine feste Distanz zwischen GOPs haben.\n- '\@GOP.Type.Variable\@' erlaubt GOPs mit variabler Größe, je nachdem, was gebraucht wird.\n'\@GOP.Type.Fixed\@' entspricht der Arbeitsweise der H264-Implementierung und funktioniert am besten für lokales Streamen, während '\@GOP.Type.Variable\@' am besten für Aufnahmen mit hoher Qualität und kleiner Größe ist."
+GOP.Type.Fixed="Fest"
+GOP.Type.Variable="Variabel"
+GOP.Size="GOP Größe"
+GOP.Size.Description="Größe eines GOP (Group Of Pictures / Gruppe Von Bildern) in Bildern."
+GOP.Size.Minimum="Minimale GOP Größe"
+GOP.Size.Minimum.Description="Minimale Größe eines GOP (Group Of Pictures / Gruppe Von Bildern) in Bildern."
+GOP.Size.Maximum="Maximale GOP Größe"
+GOP.Size.Maximum.Description="Maximale Größe eines GOP (Group Of Pictures / Gruppe Von Bildern) in Bildern."
+GOP.Alignment="GOP Ausrichtung"
+GOP.Alignment.Description="Experimentell, Effekte sind unbekannt. Benutzung auf eigene Gefahr."
+BFrame.Pattern="B-Bilder Struktur"
+BFrame.Pattern.Description="Wie viele B-Bilder beim Kodieren verwendet werden sollen.\nWird von VCE-Karten der 2. und 3. Generation unterstützt. Negativer Einfluss auf die Kodierungsleistung."
+BFrame.DeltaQP="B-Bild Delta QP"
+BFrame.DeltaQP.Description="Delta-QP-Wert zum letzten I- oder P-Bild für nicht referenzierbare B-Bilder."
+BFrame.Reference="B-Bild Referenz"
+BFrame.Reference.Description="Erlaube einem B-Bild, andere B-Bilder als Referenz zu verwenden, anstatt nur P- und I-Bilder."
+BFrame.ReferenceDeltaQP="B-Bild Referenz Delta QP"
+BFrame.ReferenceDeltaQP.Description="Delta-QP-Wert zum letzten I- oder P-Bild für referenzierbare B-Bilder."
+DeblockingFilter="Entblockungsfilter"
+DeblockingFilter.Description="Erlaube dem Dekodierer, einen Entblockungsfilter zu verwenden."
+MotionEstimation="Bewegungsschätzung"
+MotionEstimation.Description="Bewegungsschätzung erlaubt es dem Kodierer, die benötigte Bitrate zu reduzieren, indem dieser versucht zu erkennen, wo sich ein Pixel hinbewegt hat."
+MotionEstimation.Quarter="Viertel-Pixel"
+MotionEstimation.Half="Halb-Pixel"
+MotionEstimation.Full="Viertel- & Halb-Pixel"
+Video.API="Video API"
+Video.API.Description="Welche API soll das Backend verwenden?"
+Video.Adapter="Video Adapter"
+Video.Adapter.Description="Auf welchem Adapter soll versucht werden zu kodieren?"
+OpenCL="OpenCL"
+OpenCL.Description="Soll OpenCL für das Übertragen von Bildern verwendet werden? Technisch schneller, verursacht aber Probleme mit Intel-Treibern (aufgrund inkompatibler OpenCL-Bibliotheken)."
+View="Ansichtsmodus"
+View.Description="Welche Eigenschaften sollen sichtbar sein?\nDas Benutzen von '\@View.Master\@' disqualifiziert dich von jeglichem Support."
+View.Basic="Grundlegend"
+View.Advanced="Erweitert"
+View.Expert="Experte"
+View.Master="Meister"
+Debug="Debug"
+Debug.Description="Aktiviere zusätzliche Fehlerprotokollnachrichten. Benötigt, dass Open Broadcaster Software Studio mit den Kommandozeilenargumenten '--verbose --log_unfiltered' (ohne ') gestartet wurde."
AMF.H264.MaximumLTRFrames="Maximale Langzeitreferenz-Bilder"
AMF.H264.MaximumLTRFrames.Description="Langzeitreferenz (LTR) Bilder ist ein Feature das dem Codierer erlaubt, einige Bilder innerhalb einer Sequenz als referenzierbar zu markieren.\nLTR Bilder können nicht zusammen mit B-Bildern verwendet werden und der Codierer wird B-Bilder deaktivieren sofern diese verwendet werden."
AMF.H264.MaximumAccessUnitSize="Maximale Zugriffseinheitsgröße"
AMF.H264.HeaderInsertionSpacing.Description="Wie viele Bilder zwischen NAL-Kopfzeilen sein sollen."
AMF.H264.WaitForTask="Warte auf Arbeit"
AMF.H264.WaitForTask.Description="Unbekannt, Experimentell"
-AMF.H264.PreAnalysisPass="Voranalyse Durchlauf"
-AMF.H264.PreAnalysisPass.Description="Unbekannt, Experimentell"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Unbekannt, Experimentell"
-AMF.H264.GOPSize="GOP Größe"
-AMF.H264.GOPSize.Description="Unbekannt, Experimentell"
-AMF.H264.GOPAlignment="GOP Angleichung"
-AMF.H264.GOPAlignment.Description="Unbekannt, Experimentell"
-AMF.H264.MaximumReferenceFrames="Maximale Referenzbilder"
-AMF.H264.MaximumReferenceFrames.Description="Unbekannt, Experimentell"
AMF.H264.SlicesPerFrame="Slices pro Frame"
AMF.H264.SlicesPerFrame.Description="Wie viele I-Bild-Schnitte sollen mit jedem Bild gespeichert werden?\nEin Wert von Null erlaubt es dem Codierer während dem codieren die Entscheidung zu treffen.\nIntra-Erneuerungs-Codieren wird genutzt für schnelleres abspielen und suchen."
AMF.H264.SliceMode="Schnittmodus"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Unbekannt, Experimentell"
AMF.H264.IntraRefresh.MacroblocksPerSlot="Intra-Refresh Makroblöcke pro Slot"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Wie viele Makroblöcke sollen pro Slot gespeichert werden?\nEin Wert von Null schaltet diese Feature ab.\nIntra-Erneuerungs-Codieren wird genutzt für schnelleres abspielen und suchen."
-AMF.H264.VideoAPI="Video API"
-AMF.H264.VideoAPI.Description="Welche API für das codieren verwendet werden soll."
-AMF.H264.VideoAdapter="Video Adapter"
-AMF.H264.VideoAdapter.Description="Welcher Adapter für das codieren verwendet werden soll."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Soll der Codierer OpenCL zum übertragen der Bilder verwenden?"
-AMF.H264.View="Ansichtsmodus"
-AMF.H264.View.Description="Welche Eigenschaften sollen sichtbar sein?\nDas benutzen von '\@AMF.H264.View.Master\@' disqualifiziert dich von jeglichen Support."
-AMF.H264.View.Basic="Grundlegend"
-AMF.H264.View.Advanced="Erweitert"
-AMF.H264.View.Expert="Experte"
-AMF.H264.View.Master="Meister"
-AMF.H264.Debug="Debug"
-AMF.H264.Debug.Description="Aktiviere erweiterte Debug-Nachrichten, sollte aktiv sein wenn man Hilfe erwartet mit diesem Codierer."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/en-US.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/en-US.ini
Changed
# Shared
-AMF.Util.Default="Default"
-AMF.Util.Automatic="Automatic"
-AMF.Util.Manual="Manual"
-AMF.Util.Toggle.Disabled="Disabled"
-AMF.Util.Toggle.Enabled="Enabled"
-# H264
-# - Preset
-AMF.H264.Preset="Preset"
-AMF.H264.Preset.ResetToDefaults="Reset to Defaults"
-AMF.H264.Preset.Recording="Recording"
-AMF.H264.Preset.HighQuality="High Quality"
-AMF.H264.Preset.Indistinguishable="Indistinguishable"
-AMF.H264.Preset.Lossless="Lossless"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-# - Startup Properties
-AMF.H264.Usage="Usage"
-AMF.H264.Usage.Description="What usage AMF should be tuned for:\n- '\@AMF.H264.Usage.Transcoding\@' is general purpose transcoding (recommended),\n- '\@AMF.H264.Usage.UltraLowLatency\@' is for really low latency encoding,\n- '\@AMF.H264.Usage.LowLatency\@' is similar to above with a slightly higher latency.\nStreaming only supports '\@AMF.H264.Usage.Transcoding\@', all other values can be used for recording."
-AMF.H264.Usage.Transcoding="Transcoding"
-AMF.H264.Usage.UltraLowLatency="Ultra Low Latency"
-AMF.H264.Usage.LowLatency="Low Latency"
-AMF.H264.QualityPreset="Quality Preset"
-AMF.H264.QualityPreset.Description="What Quality Preset AMF should attempt to target:\n- '\@AMF.H264.QualityPreset.Speed\@' is the fastest but has the worst quality,\n- '\@AMF.H264.QualityPreset.Balanced\@' is a balanced mix of both,\n- '\@AMF.H264.QualityPreset.Quality\@' gives the best quality for a given bitrate."
-AMF.H264.QualityPreset.Speed="Speed"
-AMF.H264.QualityPreset.Balanced="Balanced"
-AMF.H264.QualityPreset.Quality="Quality"
-AMF.H264.Profile="Profile"
-AMF.H264.Profile.Description="Which H.264 Profile to use for encoding, sorted from highest quality to most widespread support."
-AMF.H264.ProfileLevel="Profile Level"
-AMF.H264.ProfileLevel.Description="Which H.264 Profile Level to use for encoding:\n- '\@AMF.Util.Automatic\@' calculates the best profile level for the given Frame Rate and Frame Size,\n- '4.1' supports 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' supports 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' supports 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' supports 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' supports 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-# - Rate Control Properties
-AMF.H264.RateControlMethod="Rate Control Method"
-AMF.H264.RateControlMethod.Description="What rate control method should be used:\n- '\@AMF.H264.RateControlMethod.CQP\@' assigns fixed I-/P-/B-Frame QP values,\n- '\@AMF.H264.RateControlMethod.CBR\@' stays at the given Target Bitrate (using Filler Data) (recommended for streaming),\n- '\@AMF.H264.RateControlMethod.VBR\@' stays below the given Peak Bitrate,\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@' stays close to the Target Bitrate if GPU latency and load allow for it, otherwise will use higher bitrate (recommended for recording)."
-AMF.H264.RateControlMethod.CQP="Constant QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Constant Bitrate (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Variable Bitrate (Peak Constrained) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Variable Bitrate (Latency Constrained) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Target Bitrate"
-AMF.H264.Bitrate.Target.Description="Bitrate to attempt to achieve in the overall sequence."
-AMF.H264.Bitrate.Peak="Peak Bitrate"
-AMF.H264.Bitrate.Peak.Description="Bitrate to attempt to maximally peak to in the overall sequence."
-AMF.H264.QP.Minimum="Minimum QP"
-AMF.H264.QP.Minimum.Description="Lowest QP value to use in a Frame."
-AMF.H264.QP.Maximum="Maximum QP"
-AMF.H264.QP.Maximum.Description="Highest QP value to use in a Frame."
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.IFrame.Description="Fixed QP value to use for I-Frames."
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.PFrame.Description="Fixed QP value to use for P-Frames."
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.QP.BFrame.Description="Fixed QP value to use for B-Frames."
-AMF.H264.VBVBuffer="VBV Buffer"
-AMF.H264.VBVBuffer.Description="What method should be used to determine the VBV Buffer Size:\n- '\@AMF.Util.Automatic\@' calculates the size using a strictness constraint,\n- '\@AMF.Util.Manual\@' allows the user to control the size.\nVBV (Video Buffering Verifier) Buffer is used by certain Rate Control Methods to keep the overall bitrate within the given constraints."
-AMF.H264.VBVBuffer.Strictness="VBV Buffer Strictness"
-AMF.H264.VBVBuffer.Strictness.Description="Determines the strictness of the VBV Buffer, with 100% being as strict as possible and 0% being unrestricted."
-AMF.H264.VBVBuffer.Size="VBV Buffer Size"
-AMF.H264.VBVBuffer.Size.Description="The size of the VBV Buffer which is used for Bitrate control in a sequence."
-AMF.H264.VBVBuffer.Fullness="VBV Buffer Fullness"
-AMF.H264.VBVBuffer.Fullness.Description="How full the VBV Buffer initially is, will only affect the initial sequence of encoding."
-AMF.H264.FillerData="Filler Data"
-AMF.H264.FillerData.Description="Enabling Filler Data allows the encoder to keep at least the Target Bitrate by filling up the remaining space in a sequence with empty information."
-AMF.H264.FrameSkipping="Frame Skipping"
-AMF.H264.FrameSkipping.Description="Frame Skipping allows the encoder to drop frames in order to meet Target Bitrate requirements.\nWhen the encoder drops a frame it instead insert a repeat-last-frame NAL into the stream.\nCan help with very low Target Bitrates."
-AMF.H264.EnforceHRDCompatibility="Enforce HRD Compatibility"
-AMF.H264.EnforceHRDCompatibility.Description="Enforce Hypothetical Reference Decoder restrictions which limit the maximum QP value change within a frame."
-# - Picture Control Properties
-AMF.H264.KeyframeInterval="Keyframe Interval"
-AMF.H264.KeyframeInterval.Description="Defines the distance between Keyframes in seconds. Also controls GOP-sequence size."
-AMF.H264.IDRPeriod="IDR Period"
-AMF.H264.IDRPeriod.Description="Defines the distance between Instantaneous Decoding Refreshes (IDR) in frames. Also controls GOP-sequence size."
-AMF.H264.BFrame.Pattern="B-Frames"
-AMF.H264.BFrame.Pattern.Description="The amount of B-Frames to use while encoding.\nSupported by 2nd and 3rd Generation VCE cards. Negative impact on encoding performance."
-AMF.H264.BFrame.DeltaQP="Delta QP for B-Frames"
-AMF.H264.BFrame.DeltaQP.Description="Delta QP value to the last I- or P-Frame for non-referenceable B-Frames."
-AMF.H264.BFrame.Reference="Referenceable B-Frames"
-AMF.H264.BFrame.Reference.Description="Allow a B-Frame to also use B-Frames as reference, instead of just P- and I-Frames."
-AMF.H264.BFrame.ReferenceDeltaQP="Delta QP for referenceable B-Frames"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="Delta QP value to the last I- or P-Frame for referenceable B-Frames."
-AMF.H264.DeblockingFilter="Deblocking Filter"
-AMF.H264.DeblockingFilter.Description="Sets the flag that the decoder is allowed to use a Deblocking Filter for the encoded stream."
-# - Miscellaneous Properties
-AMF.H264.ScanType="Scan Type"
-AMF.H264.ScanType.Description="Which scanning method to use, always leave this on '\@AMF.H264.ScanType.Progressive\@'."
-AMF.H264.ScanType.Progressive="Progressive"
-AMF.H264.ScanType.Interlaced="Interlaced"
-AMF.H264.MotionEstimation="Motion Estimation"
-AMF.H264.MotionEstimation.Description="Motion Estimation allows the encoder to reduce needed bitrate by estimating where a pixel went."
-AMF.H264.MotionEstimation.None="None"
-AMF.H264.MotionEstimation.Half="Half-Pixel"
-AMF.H264.MotionEstimation.Quarter="Quarter-Pixel"
-AMF.H264.MotionEstimation.Both="Half- & Quarter-Pixel"
+Utility.Default="Default"
+Utility.Automatic="Automatic"
+Utility.Manual="Manual"
+Utility.Switch.Disabled="Disabled"
+Utility.Switch.Enabled="Enabled"
+Preset="Preset"
+Preset.ResetToDefaults="Reset to Defaults"
+Preset.Recording="Recording"
+Preset.HighQuality="High Quality"
+Preset.Indistinguishable="Indistinguishable"
+Preset.Lossless="Lossless"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+# Static
+Usage="Usage"
+Usage.Description="What usage AMF should be tuned for:\n- '\@Usage.Transcoding\@' is general purpose transcoding (recommended),\n- '\@Usage.UltraLowLatency\@' is for really low latency encoding,\n- '\@Usage.LowLatency\@' is similar to above with a slightly higher latency.\nStreaming only supports '\@Usage.Transcoding\@', all other values can be used for recording."
+Usage.Transcoding="Transcoding"
+Usage.UltraLowLatency="Ultra Low Latency"
+Usage.LowLatency="Low Latency"
+Usage.Webcam="Webcam"
+QualityPreset="Quality Preset"
+QualityPreset.Description="What Quality Preset AMF should attempt to target:\n- '\@QualityPreset.Speed\@' is the fastest but has the worst quality,\n- '\@QualityPreset.Balanced\@' is a balanced mix of both,\n- '\@QualityPreset.Quality\@' gives the best quality for a given bitrate."
+QualityPreset.Speed="Speed"
+QualityPreset.Balanced="Balanced"
+QualityPreset.Quality="Quality"
+Profile="Profile"
+Profile.Description="What Profile to encode with. Sorted from best supported (top) to best quality (bottom)."
+ProfileLevel="Profile Level"
+ProfileLevel.Description="What Profile Level to use. It is best to leave this at \@Utility.Automatic\@."
+Tier="Tier"
+Tier.Description="What Tier to encode at. 'High' targets high bitrate/bandwidth uses while 'Main' is aimed at mainstream media."
+AspectRatio="Aspect Ratio"
+AspectRatio.Description="Which Aspect Ratio should be written into the output file."
+CodingType="Coding Type"
+CodingType.Description="Which type of coding to use:\n* '\@Utility.Automatic\@' lets AMF decide (recommended).\n* 'CALVC' (Context-Adaptive Variable-Length Coding) is faster, but larger.\n* 'CABAC' (Context-Adaptive Binary Arithmetic Coding) is slower, but smaller."
+MaximumReferenceFrames="Maximum Reference Frames"
+MaximumReferenceFrames.Description="How many Frames the encoder may reference at most when encoding. Has a direct impact on encoding quality."
+# Rate Control
+RateControlMethod="Rate Control Method"
+RateControlMethod.Description="What rate control method should be used:\n- '\@RateControlMethod.CQP\@' assigns fixed I-/P-/B-Frame QP values,\n- '\@RateControlMethod.CBR\@' stays at the given Target Bitrate (using Filler Data) (recommended for streaming),\n- '\@RateControlMethod.VBR\@' stays below the given Peak Bitrate,\n- '\@RateControlMethod.VBRLAT\@' stays close to the Target Bitrate if GPU latency and load allow for it, otherwise will use higher bitrate (recommended for recording)."
+RateControlMethod.CQP="Constant QP (CQP)"
+RateControlMethod.CBR="Constant Bitrate (CBR)"
+RateControlMethod.VBR="Variable Bitrate (Peak Constrained) (VBR)"
+RateControlMethod.VBRLAT="Variable Bitrate (Latency Constrained) (VBRLAT)"
+PrePassMode="Pre-Pass Mode"
+PrePassMode.Description="Pre-Pass is a secondary bitrate distribution pass that allows for better distribution of the bitrate within a sequence, however the effects of this may vary from card to card."
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (Quarter Size)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (Half Size)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (Full Size)"
+Bitrate.Target="Target Bitrate"
+Bitrate.Target.Description="Bitrate to attempt to achieve in the overall sequence."
+Bitrate.Peak="Peak Bitrate"
+Bitrate.Peak.Description="Bitrate to attempt to maximally peak to in the overall sequence."
+QP.IFrame="I-Frame QP"
+QP.IFrame.Description="Fixed QP value to use for I-Frames."
+QP.PFrame="P-Frame QP"
+QP.PFrame.Description="Fixed QP value to use for P-Frames."
+QP.BFrame="B-Frame QP"
+QP.BFrame.Description="Fixed QP value to use for B-Frames."
+QP.Minimum="Minimum QP"
+QP.Minimum.Description="Lowest QP value to use in a Frame."
+QP.IFrame.Minimum="Minimum I-Frame QP"
+QP.IFrame.Minimum.Description="Lowest QP value to use in an I-Frame."
+QP.PFrame.Minimum="Minimum P-Frame QP"
+QP.PFrame.Minimum.Description="Lowest QP value to use in a P-Frame."
+QP.Maximum="Maximum QP"
+QP.Maximum.Description="Highest QP value to use in a Frame."
+QP.IFrame.Maximum="Maximum I-Frame QP"
+QP.IFrame.Maximum.Description="Highest QP value to use in an I-Frame."
+QP.PFrame.Maximum="Maximum P-Frame QP"
+QP.PFrame.Maximum.Description="Highest QP value to use in a P-Frame."
+FillerData="Filler Data"
+FillerData.Description="Enabling Filler Data allows the encoder to keep at least the \@Bitrate.Target\@ by filling up the remaining space in a sequence with empty information."
+FrameSkipping="Frame Skipping"
+FrameSkipping.Description="Frame Skipping allows the encoder to skip frames in order to meet the \@Bitrate.Target\@ requirements.\nWhen the encoder skips a frame it instead inserts a repeat-last-frame NAL into the stream.\nCan help with very low \@Bitrate.Target\@ requirements."
+FrameSkipping.Period="Skip-Frame Period"
+FrameSkipping.Period.Description="Interval (in Frames) between skipped frames. This can be used to simulate a lower framerate than OBS is set to while still having a valid stream."
+FrameSkipping.Behaviour="Skip-Frame Behaviour"
+FrameSkipping.Behaviour.Description="Define how Frame Skipping behaves."
+FrameSkipping.SkipNth="Skip only every Nth frame"
+FrameSkipping.KeepNth="Keep only every Nth frame"
+VBAQ="VBAQ"
+VBAQ.Description="Enable the use of 'Variance Based Adaptive Quantization' (VBAQ) which is based on pixel variance to distribute bitrate better.\nIt works on the idea that the human visual system is less sensitive to artifacts in highly textured areas and thus will push the bitrate towards smoother surfaces.\nEnabling this may lead to improvements in subjective quality with certain content."
+EnforceHRD="Enforce HRD"
+EnforceHRD.Description="Enforce the use of a Hypothetical Reference Decoder which is used to verify that the output bitstream is correct."
+# VBV Buffer
+VBVBuffer="VBV Buffer"
+VBVBuffer.Description="What method should be used to determine the VBV Buffer Size:\n- '\@Utility.Automatic\@' calculates the size using a strictness constraint,\n- '\@Utility.Manual\@' allows the user to control the size.\nVBV (Video Buffering Verifier) Buffer is used by certain Rate Control Methods to keep the overall bitrate within the given constraints."
+VBVBuffer.Strictness="VBV Buffer Strictness"
+VBVBuffer.Strictness.Description="Determines the strictness of the VBV Buffer, with 100% being as strict as possible and 0% being unrestricted."
+VBVBuffer.Size="VBV Buffer Size"
+VBVBuffer.Size.Description="The size of the VBV Buffer which is used for Bitrate control in a sequence."
+VBVBuffer.InitialFullness="VBV Buffer Initial Fullness"
+VBVBuffer.InitialFullness.Description="How full the VBV Buffer initially is (in %), will only affect the initial sequence of encoding."
+# Picture Control
+Interval.Keyframe="Keyframe Interval"
+Interval.Keyframe.Description="Interval (in Seconds) between Keyframes."
+Period.IDR.H264="IDR Period (in Frames)"
+Period.IDR.H264.Description="Defines the distance between Instantaneous Decoding Refreshes (IDR) in frames."
+Period.IDR.H265="IDR Period (in GOPs)"
+Period.IDR.H265.Description="Defines the distance between Instantaneous Decoding Refreshes (IDR) in GOPs."
+Interval.IFrame="I-Frame Interval"
+Interval.IFrame.Description="Interval (in Seconds) between I-Frames. I-Frames override P-Frames and B-Frames."
+Period.IFrame="I-Frame Period (in Frames)"
+Period.IFrame.Description="Distance (in Frames) between I-Frames. I-Frames override P-Frames and B-Frames."
+Interval.PFrame="P-Frame Interval"
+Interval.PFrame.Description="Interval (in Seconds) between P-Frames. P-Frames override B-Frames."
+Period.PFrame="P-Frame Period (in Frames)"
+Period.PFrame.Description="Distance (in Frames) between P-Frames. P-Frames override B-Frames."
+Interval.BFrame="B-Frame Interval"
+Interval.BFrame.Description="Interval (in Seconds) between B-Frames."
+Period.BFrame="B-Frame Period (in Frames)"
+Period.BFrame.Description="Distance (in Frames) between B-Frames."
+GOP.Type="GOP Type"
+GOP.Type.Description="Which Type of GOP should be used:\n- '\@GOP.Type.Fixed\@' will always use fixed distances between each GOP.\n- '\@GOP.Type.Variable\@' allows for GOPs of varying sizes, depending on what is needed.\n'\@GOP.Type.Fixed\@' is how the H264 implementation works and best for local network streaming, while '\@GOP.Type.Variable\@' is best for low size high quality recordings."
+GOP.Type.Fixed="Fixed"
+GOP.Type.Variable="Variable"
+GOP.Size="GOP Size"
+GOP.Size.Description="Size of a GOP (Group Of Pictures) in Frames."
+GOP.Size.Minimum="GOP Size Minimum"
+GOP.Size.Minimum.Description="Minimum Size of a GOP (Group Of Pictures) in Frames."
+GOP.Size.Maximum="GOP Size Maximum"
+GOP.Size.Maximum.Description="Maximum Size of a GOP (Group Of Pictures) in Frames."
+GOP.Alignment="GOP Alignment"
+GOP.Alignment.Description="Experimental, Effects are Unknown. Use at your own Risk."
+BFrame.Pattern="B-Frame Pattern"
+BFrame.Pattern.Description="The number of B-Frames to use while encoding.\nSupported by 2nd and 3rd Generation VCE cards. Negative impact on encoding performance."
+BFrame.DeltaQP="B-Frame Delta QP"
+BFrame.DeltaQP.Description="Delta QP value to the last I- or P-Frame for non-referenceable B-Frames."
+BFrame.Reference="B-Frame Reference"
+BFrame.Reference.Description="Allow a B-Frame to also use B-Frames as reference, instead of just P- and I-Frames."
+BFrame.ReferenceDeltaQP="B-Frame Reference Delta QP"
+BFrame.ReferenceDeltaQP.Description="Delta QP value to the last I- or P-Frame for referenceable B-Frames."
+DeblockingFilter="Deblocking Filter"
+DeblockingFilter.Description="Allow the decoder to apply a Deblocking Filter."
+MotionEstimation="Motion Estimation"
+MotionEstimation.Description="Motion Estimation allows the encoder to reduce needed bitrate by estimating where a pixel went."
+MotionEstimation.Quarter="Quarter-Pixel"
+MotionEstimation.Half="Half-Pixel"
+MotionEstimation.Full="Quarter- & Half-Pixel"
+# System
+Video.API="Video API"
+Video.API.Description="What API should the backend use?"
+Video.Adapter="Video Adapter"
+Video.Adapter.Description="On what Adapter should we attempt to encode?"
+OpenCL.Transfer="OpenCL Transfer"
+OpenCL.Transfer.Description="Should OpenCL be used for Frame transfer to the GPU?"
+OpenCL.Conversion="OpenCL Conversion"
+OpenCL.Conversion.Description="Should OpenCL be used for Frame conversion on the GPU?"
+AsynchronousQueue="Asynchronous Queue"
+AsynchronousQueue.Description="Asynchronously handle submitting frames and retrieving data from the Encoder."
+AsynchronousQueue.Size="Asynchronous Queue Size"
+AsynchronousQueue.Size.Description="Maximum size of the frame and packet queue before dropping either."
+View="View Mode"
+View.Description="What properties should be shown?\nUsing '\@View.Master\@' will disqualify you from receiving support."
+View.Basic="Basic"
+View.Advanced="Advanced"
+View.Expert="Expert"
+View.Master="Master"
+Debug="Debug"
+Debug.Description="Enable additional Debug messages. Requires that you run Open Broadcaster Software Studio with the command line '--verbose --log_unfiltered' (remove the ')."
# - Experimental Properties
-AMF.H264.CodingType="Coding Type"
-AMF.H264.CodingType.Description="Which type of coding to use:\n* \@AMF.Util.Default\@ lets AMF decide (recommended).\n* CALVC (Context-Adaptive Variable-Length Coding) is faster, but larger.\n* CABAC (Context-Adaptive Binary Arithmetic Coding) is slower, but smaller."
AMF.H264.MaximumLTRFrames="Maximum LTR Frames"
AMF.H264.MaximumLTRFrames.Description="Long Term Reference (LTR) Frames are a feature that allows the encoder to flag certain frames in a sequence as referenceable for a long time.\nLTR Frames can't be used with B-Frames and the encoder will disable B-Frames if these are used."
AMF.H264.MaximumAccessUnitSize="Maximum Access Unit Size"
AMF.H264.HeaderInsertionSpacing.Description="How many frames should be between NAL headers."
AMF.H264.WaitForTask="Wait For Task"
AMF.H264.WaitForTask.Description="Unknown, Experimental"
-AMF.H264.PreAnalysisPass="Pre Analysis Pass"
-AMF.H264.PreAnalysisPass.Description="Unknown, Experimental"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Unknown, Experimental"
-AMF.H264.GOPSize="GOP Size"
-AMF.H264.GOPSize.Description="Unknown, Experimental"
-AMF.H264.GOPAlignment="GOP Alignment"
-AMF.H264.GOPAlignment.Description="Unknown, Experimental"
-AMF.H264.MaximumReferenceFrames="Maximum Reference Frames"
-AMF.H264.MaximumReferenceFrames.Description="Unknown, Experimental"
AMF.H264.SlicesPerFrame="Slices Per Frame"
AMF.H264.SlicesPerFrame.Description="How many I-Frame slices should be stored with each frame?\nA value of zero lets the encoder decide on the fly.\nIntra-Refresh encoding is used for faster playback and seeking."
AMF.H264.SliceMode="Slice Mode"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Unknown, Experimental"
AMF.H264.IntraRefresh.MacroblocksPerSlot="Intra-Refresh Macroblocks per Slot"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="How many Macroblocks should be stored in each slot?\nA value of 0 disables this feature.\nIntra-Refresh encoding is used for faster playback and seeking."
-# - System Properties
-AMF.H264.VideoAPI="Video API"
-AMF.H264.VideoAPI.Description="Which API to use for encoding."
-AMF.H264.VideoAdapter="Video Adapter"
-AMF.H264.VideoAdapter.Description="Which Adapter to use for encoding."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Should the Encoder use OpenCL to submit the individual frames?"
-AMF.H264.View="View Mode"
-AMF.H264.View.Description="What properties should be shown?\nUsing '\@AMF.H264.View.Master\@' will disqualify you from receiving support."
-AMF.H264.View.Basic="Basic"
-AMF.H264.View.Advanced="Advanced"
-AMF.H264.View.Expert="Expert"
-AMF.H264.View.Master="Master"
-AMF.H264.Debug="Debug"
-AMF.H264.Debug.Description="Enable additional debug logging, should be active whenever you need support with this encoder."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/es-ES.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/es-ES.ini
Changed
-AMF.Util.Default="Por defecto"
-AMF.Util.Automatic="Automático"
-AMF.Util.Manual="Manual"
-AMF.Util.Toggle.Disabled="Deshabilitado"
-AMF.Util.Toggle.Enabled="Habilitado"
-AMF.H264.Preset="Preajuste"
-AMF.H264.Preset.ResetToDefaults="Restablecer por defecto"
-AMF.H264.Preset.Recording="Grabación"
-AMF.H264.Preset.HighQuality="Alta Calidad"
-AMF.H264.Preset.Indistinguishable="Indistinguible"
-AMF.H264.Preset.Lossless="Sin pérdidas"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Uso"
-AMF.H264.Usage.Description="A que uso debe ajustarse AMF:\n- 'Codificación' es para uso generalizado (recomendado),\n- 'Ultra Baja Latencia' es para codificación de muy baja latencia,\n- 'Baja Latencia' es similar a la anterior con algo mas de latencia.\nLas emisiones en directo solo soportan 'Codificación', todos los valores pueden ser usados para grabar."
-AMF.H264.Usage.Transcoding="Codificación"
-AMF.H264.Usage.UltraLowLatency="Latencia ultra baja"
-AMF.H264.Usage.LowLatency="Latencia baja"
-AMF.H264.QualityPreset="Calidad del perfil"
-AMF.H264.QualityPreset.Description="Que calidad del perfil de AMD se debe intentar conseguir:\n- 'Velocidad' es la mas rápida pero la que peor calidad obtiene,\n- 'Equilibrado' está entre 'Velocidad' y 'Calidad' ofreciendo un balance entre los dos,\n- 'Calidad' ofrece la mejor calidad posible para una determinada tasa de bits."
-AMF.H264.QualityPreset.Speed="Velocidad"
-AMF.H264.QualityPreset.Balanced="Equilibrado"
-AMF.H264.QualityPreset.Quality="Calidad"
-AMF.H264.Profile="Perfil del Códec"
-AMF.H264.Profile.Description="Que perfil H.264 a utilizar para la codificación:\n- 'Baseline' tiene el mayor soporte en las plataformas,\n- 'Main' es compatible con dispositivos algo mas anticuados (recomendado si la emisión va dirigida a dispositivos móviles),\n- 'High' es compatible con los dispositivos actuales (recomendado)."
-AMF.H264.ProfileLevel="Nivel del Perfil"
-AMF.H264.ProfileLevel.Description="Nivel de perfil H.264 a utilizar para la codificación:\n- 'Automático' calcula el mejor nivel de perfil para cierta velocidad y tamaño de fotogramas,\n- '4.1' soporta 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' soporta 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' soporta 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' soporta 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' soporta 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Método de control del flujo"
-AMF.H264.RateControlMethod.Description="Qué método de control de flujo debe ser usado:\n- '\@AMF.H264.RateControlMethod.CQP\@' asigna valores fijos de QP en I-/P-/B-Frames (Parámetro de cuantización),\n- '\@AMF.H264.RateControlMethod.CBR\@' se mantiene en la tasa de bits objetivo (usando Datos de relleno) (recomendado para emisiones en directo),\n- '\@AMF.H264.RateControlMethod.VBR\@' se mantiene por debajo de un pico de tasa de bits,\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@' se mantiene cerca de la tasa de bits deseada si la latencia y carga de la GPU lo permite, si no se aumentará la tasa de bits (recomendado para grabaciones)."
-AMF.H264.RateControlMethod.CQP="QP constante (CQP)"
-AMF.H264.RateControlMethod.CBR="Flujo constante (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Flujo variable (pico restringido) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Flujo variable (latencia restringida) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Tasa de bits deseada"
-AMF.H264.Bitrate.Target.Description="Tasa de bits a intentar alcanzar en la secuencia general."
-AMF.H264.Bitrate.Peak="Pico de tasa de bits"
-AMF.H264.Bitrate.Peak.Description="Tasa de bits a intentar alcanzar como pico máximo en la secuencia general."
-AMF.H264.QP.Minimum="QP Mínimo"
-AMF.H264.QP.Minimum.Description="Valor mínimo de QP (parámetro de cuantización) a utilizar en un fotograma."
-AMF.H264.QP.Maximum="QP Máximo"
-AMF.H264.QP.Maximum.Description="Valor máximo de QP (parámetro de cuantización) a utilizar en un fotograma."
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.IFrame.Description="Valor fijo de QP para I-Frames."
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.PFrame.Description="Valor fijo de QP para P-Frames."
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.QP.BFrame.Description="Valor de QP Fijo (parámetro de cuantización) a usar por B-Frames."
-AMF.H264.VBVBuffer="Buffer VBV"
-AMF.H264.VBVBuffer.Description="Que método se debe usar para determinar el tamaño del buffer VBV:\n- 'Automático' calcula el tamaño usando una restricción estricta,\n- 'Manual' permite al usuario controlar el tamaño.\nEl buffer VBV (Verificador de Buffering de Video) es usado por ciertos métodos de control del flujo para mantener la tasa de bits dentro de los parámetros establecidos."
-AMF.H264.VBVBuffer.Strictness="Estricticidad del Buffer VBV"
-AMF.H264.VBVBuffer.Strictness.Description="Determina la rigidez del Buffer VBV, con 100% siendo tan estricto como sea posible y 0% sin restricción."
-AMF.H264.VBVBuffer.Size="Tamaño de buffer VBV"
-AMF.H264.VBVBuffer.Size.Description="Tamaño del Buffer VBV que se utiliza para el control de Bitrate en una secuencia."
-AMF.H264.VBVBuffer.Fullness="Amplitud del Buffer VBV"
-AMF.H264.VBVBuffer.Fullness.Description="Como de lleno es el buffer VMV inicialmente, solo afectará a la secuencia inicial de la codificación."
-AMF.H264.FillerData="Datos de relleno"
-AMF.H264.FillerData.Description="Habilitando Datos de relleno se permite al codificador mantener por lo menos la tasa de bits deseada rellenando el espacio que falta con información sin valor."
-AMF.H264.FrameSkipping="Omisión de fotogramas"
-AMF.H264.FrameSkipping.Description="Omisión de fotogramas permite al codificador saltar fotogramas para cumplir con el requerimiento de la tasa de bits objetivo.\nCuando el codificador salta un fotograma insertará un NAL que repetirá el ultimo fotograma codificado en el stream.\nPuede ayudar con tasa de bits objetivo muy bajas."
-AMF.H264.EnforceHRDCompatibility="Forzar compatibilidad con HRD"
-AMF.H264.EnforceHRDCompatibility.Description="Forzar las restricciones del decodificador hipotético de referencia que limitan el cambio de valor máximo de QP dentro de un fotograma.\nNo recomendado para grabación o emisión en directo y solo se debe usar cuando el objetivo son dispositivos antiguos que solo tienen decodificadores de referencia por software."
-AMF.H264.KeyframeInterval="Intervalo de fotogramas clave"
-AMF.H264.KeyframeInterval.Description="Cuantos segundos deben haber entre fotogramas que no se pueden descartar.\nTambién controla el tamaño de la secuencia (GOP)."
-AMF.H264.IDRPeriod="Periodo IDR"
-AMF.H264.IDRPeriod.Description="Define la distancia entre Instantaneous Decoding Refreshes (IDR) en Fotogramas. También controla el tamaño de la secuencia del GOP."
-AMF.H264.BFrame.Pattern="B-Frames"
-AMF.H264.BFrame.Pattern.Description="La cantidad de B-Frames a usar mientras se codifica.\nSoportado con tarjetas de 2 º y 3 º generación VCE. Impacto negativo en el rendimiento de codificación."
-AMF.H264.BFrame.DeltaQP="Delta QP para B-Frames"
-AMF.H264.BFrame.DeltaQP.Description="Valor Delta QP para el ultimo I- o P-Frame para B-Frames no referenciables."
-AMF.H264.BFrame.Reference="B-Frames referenciables"
-AMF.H264.BFrame.Reference.Description="Permitir a un B-Frame utilizar también B-Frames como referencia, en lugar de P - y I-Frames."
-AMF.H264.BFrame.ReferenceDeltaQP="Delta QP para los fotogramas referenciables"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="Valor Delta QP para el ultimo I- o P-Frame para B-Frames referenciables."
-AMF.H264.DeblockingFilter="Filtro de eliminación de bloques"
-AMF.H264.DeblockingFilter.Description="Establece el indicador de que el decodificador está permitido a usar el Filtro de eliminación de bloques para el stream codificado."
-AMF.H264.ScanType="Tipo de escaneo"
-AMF.H264.ScanType.Description="Que método de escaneo usar, dejar siempre en 'Progresivo'."
-AMF.H264.ScanType.Progressive="Progresivo"
-AMF.H264.ScanType.Interlaced="Entrelazado"
-AMF.H264.MotionEstimation="Estimación de movimiento"
-AMF.H264.MotionEstimation.Description="Estimación de movimiento permite al codificador reducir el flujo de datos necesario estimando de donde vienen los pixeles."
-AMF.H264.MotionEstimation.None="Ninguno"
-AMF.H264.MotionEstimation.Half="Mitad de Pixel"
-AMF.H264.MotionEstimation.Quarter="Cuarto de Pixel"
-AMF.H264.MotionEstimation.Both="Mitad y cuarto de Pixel"
-AMF.H264.CodingType="Tipo de codificación"
-AMF.H264.CodingType.Description="Qué tipo de codificación utilizar:\n* \@AMF.Util.Default\@ deja que AMF lo decida (recomendado).\n* CALVC (Context-Adaptive Variable-Length Coding) es más rápido, pero más grande.\n* CABAC (Context-Adaptive Binary Arithmetic Coding) es más lento, pero más pequeño."
+Utility.Default="Por defecto"
+Utility.Automatic="Automático"
+Utility.Manual="Manual"
+Utility.Switch.Disabled="Deshabilitado"
+Utility.Switch.Enabled="Habilitado"
+Preset="Preselección"
+Preset.ResetToDefaults="Restablecer por defecto"
+Preset.Recording="Grabación"
+Preset.HighQuality="Alta Calidad"
+Preset.Indistinguishable="Indistinguible"
+Preset.Lossless="Sin pérdidas"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Uso"
+Usage.Description="A que uso debe ajustarse AMF:\n- '\@Usage.Transcoding\@' es para uso generalizado (recomendado),\n- '\@Usage.UltraLowLatency\@' es para codificación de muy baja latencia,\n- '\@Usage.LowLatency\@' es similar a la anterior con algo mas de latencia.\nLas emisiones en directo solo soportan '\@Usage.Transcoding\@', todos los valores pueden ser usados para grabar."
+Usage.Transcoding="Transcodificación"
+Usage.UltraLowLatency="Latencia ultra baja"
+Usage.LowLatency="Latencia baja"
+Usage.Webcam="Webcam"
+QualityPreset="Calidad del perfil"
+QualityPreset.Description="Qué preajuste de calidad debe AMF intentar usar:\n- '\@QualityPreset.Speed\@' es el más rápido pero tiene la peor calidad,\n- '\@QualityPreset.Balanced\@' es una mezcla equilibrada de ambos,\n- '\@QualityPreset.Quality\@' da la mejor calidad para un determinado bitrate."
+QualityPreset.Speed="Velocidad"
+QualityPreset.Balanced="Equilibrado"
+QualityPreset.Quality="Calidad"
+Profile="Perfil"
+Profile.Description="Perfil con el que codificar. Ordenados de mejor soportado (arriba) a mejor calidad (abajo)."
+ProfileLevel="Nivel del Perfil"
+ProfileLevel.Description="Qué nivel de perfil a utilizar. Es mejor dejar esto en \@Utility.Automatic\@."
+Tier="Nivel"
+Tier.Description="Nivel en el que codificar. 'Alta' apunta a usos de alto bitrate/ancho de banda mientras 'Principal' está dirigido a medios de comunicación."
+AspectRatio="Relación de aspecto"
+AspectRatio.Description="Proporción de aspecto que debe escribirse en el archivo de salida."
+CodingType="Tipo de codificación"
+CodingType.Description="Qué tipo de codificación utilizar:\n* '\@Utility.Automatic\@' deja que AMF lo decida (recomendado).\n* 'CALVC' (Context-Adaptive Variable-Length Coding) es más rápido, pero más grande.\n* 'CABAC' (Context-Adaptive Binary Arithmetic Coding) es más lento, pero más pequeño."
+MaximumReferenceFrames="Fotogramas de referencia máximos"
+MaximumReferenceFrames.Description="A cuántos fotogramas como máximo puede hacer referencia el codificador al codificar; tiene un impacto directo en la calidad de codificación."
+RateControlMethod="Método de control del flujo"
+RateControlMethod.Description="Qué método de control de flujo debe ser usado:\n- '\@RateControlMethod.CQP\@' asigna valores fijos de QP en I-/P-/B-Frames,\n- '\@RateControlMethod.CBR\@' se mantiene en la tasa de bits objetivo (usando Datos de relleno) (recomendado para emisiones en directo),\n- '\@RateControlMethod.VBR\@' se mantiene por debajo de un pico de tasa de bits,\n- '\@RateControlMethod.VBRLAT\@' se mantiene cerca de la tasa de bits deseada si la latencia y carga de la GPU lo permite, si no se aumentará la tasa de bits (recomendado para grabaciones)."
+RateControlMethod.CQP="QP constante (CQP)"
+RateControlMethod.CBR="Flujo constante (CBR)"
+RateControlMethod.VBR="Flujo variable (pico restringido) (VBR)"
+RateControlMethod.VBRLAT="Flujo variable (latencia restringida) (VBRLAT)"
+PrePassMode="Modo de Pre-Paso"
+PrePassMode.Description="Pre-Paso es un pase de distribución de bitrate secundario que permite la mejor distribución de la velocidad de bits dentro de una secuencia, sin embargo sus efectos pueden variar según la tarjeta."
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (cuarto de tamaño)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (mitad de tamaño)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (tamaño completo)"
+Bitrate.Target="Tasa de bits deseada"
+Bitrate.Target.Description="Tasa de bits a intentar alcanzar en la secuencia general."
+Bitrate.Peak="Pico de tasa de bits"
+Bitrate.Peak.Description="Tasa de bits a intentar alcanzar como pico máximo en la secuencia general."
+QP.IFrame="I-Frame QP"
+QP.IFrame.Description="Valor fijo de QP para I-Frames."
+QP.PFrame="P-Frame QP"
+QP.PFrame.Description="Valor fijo de QP para P-Frames."
+QP.BFrame="B-Frame QP"
+QP.BFrame.Description="Valor fijo de QP para B-Frames."
+QP.Minimum="QP Mínimo"
+QP.Minimum.Description="Valor mas bajo de QP para utilizar en un fotograma."
+QP.IFrame.Minimum="I-Frame QP mínimo"
+QP.IFrame.Minimum.Description="Valor mas bajo de QP para utilizar en un I-Frame."
+QP.PFrame.Minimum="P-Frame QP mínimo"
+QP.PFrame.Minimum.Description="Valor mas bajo de QP para utilizar en un P-Frame."
+QP.Maximum="QP Máximo"
+QP.Maximum.Description="Valor mas alto de QP para utilizar en un fotograma."
+QP.IFrame.Maximum="I-Frame QP máximo"
+QP.IFrame.Maximum.Description="Valor mas alto de QP para utilizar en un I-Frame."
+QP.PFrame.Maximum="P-Frame QP máximo"
+QP.PFrame.Maximum.Description="Valor mas alto de QP para utilizar en un P-Frame."
+FillerData="Datos de relleno"
+FillerData.Description="Habilitando Datos de relleno se permite al codificador mantener por lo menos la \@Bitrate.Target\@ rellenando el espacio que falta con información sin valor."
+FrameSkipping="Omisión de fotogramas"
+FrameSkipping.Description="Omisión de fotogramas permite al codificador saltar fotogramas para cumplir con el requerimiento de la \@Bitrate.Target\@.\nCuando el codificador salta un fotograma insertará un NAL que repetirá el ultimo fotograma codificado en el stream.\nPuede ayudar con \@Bitrate.Target\@ muy bajas."
+VBAQ="VBAQ"
+VBAQ.Description="Habilitar el uso de 'Cuantizacion adaptativa basada en Varianza' (VBAQ) que se basa en la varianza del pixel para una mejor distribución del bitrate. \nFunciona con la idea de que el sistema visual humano es menos sensible a los artefactos en áreas altamente texturadas y así moverá el bitrate hacia superficies más suaves. \nHabilitando esto puede llevar a mejoras en la calidad subjetiva en cierto contenido."
+EnforceHRD="Forzar HRD"
+EnforceHRD.Description="Forzar el uso de un decodificador de referencia hipotético que se utiliza para verificar que el flujo de bits de salida es correcto."
+VBVBuffer="Buffer VBV"
+VBVBuffer.Description="Que método se debe usar para determinar el tamaño del buffer VBV:\n- '\@Utility.Automatic\@' calcula el tamaño usando una restricción estricta,\n- '\@Utility.Manual\@' permite al usuario controlar el tamaño.\nEl buffer VBV (Verificador de Buffering de Video) es usado por ciertos métodos de control del flujo para mantener la tasa de bits dentro de los parámetros establecidos."
+VBVBuffer.Strictness="Rigidez del Buffer VBV"
+VBVBuffer.Strictness.Description="Determina la rigidez del Buffer VBV, con 100% siendo tan estricto como sea posible y 0% sin restricción."
+VBVBuffer.Size="Tamaño de buffer VBV"
+VBVBuffer.Size.Description="Tamaño del Buffer VBV que se utiliza para el control de Bitrate en una secuencia."
+VBVBuffer.InitialFullness="Amplitud Inicial del Buffer VBV"
+VBVBuffer.InitialFullness.Description="Cómo de lleno está el buffer VBV inicialmente (en %), solo afectará a la secuencia inicial de la codificación."
+KeyframeInterval="Intervalo de fotogramas clave"
+KeyframeInterval.Description="Intervalo (en segundos) entre fotogramas clave."
+H264.IDRPeriod="Período de IDR (en fotogramas)"
+H264.IDRPeriod.Description="Define la distancia entre Instantaneous Decoding Refreshes (IDR) en Fotogramas. También controla el tamaño de la secuencia del GOP."
+H265.IDRPeriod="Período de IDR (en GOP's)"
+H265.IDRPeriod.Description="Define la distancia entre Instantaneous Decoding Refreshes (IDR) en GOPs."
+GOP.Type="Tipo de GOP"
+GOP.Type.Description="Qué tipo de GOP se debe utilizar:\n- '\@GOP.Type.Fixed\@' utilizará siempre distancias fijas entre cada GOP.\n- '\@GOP.Type.Variable\@' permite GOPs de diferentes tamaños, dependiendo de lo que se necesite.\n'\@GOP.Type.Fixed\@' es cómo funciona la implementación de H264 y lo mejor para streaming en red local, mientras que '\@GOP.Type.Variable\@' es lo mejor para grabaciones de alta calidad y bajo tamaño."
+GOP.Type.Fixed="Fijo"
+GOP.Type.Variable="Variable"
+GOP.Size="Tamaño del GOP"
+GOP.Size.Description="Tamaño máximo de un GOP (grupo de imagenes) en fotogramas."
+GOP.Size.Minimum="Tamaño mínimo de GOP"
+GOP.Size.Minimum.Description="Tamaño mínimo de un GOP (grupo de imágenes) en fotogramas."
+GOP.Size.Maximum="Tamaño máximo de GOP"
+GOP.Size.Maximum.Description="Tamaño máximo de un GOP (grupo de imagenes) en fotogramas."
+GOP.Alignment="Alineación del GOP"
+GOP.Alignment.Description="Experimental, los efectos son desconocidos. Usar bajo su propio riesgo."
+BFrame.Pattern="Patrón de B-Frame"
+BFrame.Pattern.Description="La cantidad de B-Frames a usar mientras se codifica.\nSoportado por tarjetas de 2ª y 3ª generación de VCE. Impacto negativo en el rendimiento de codificación."
+BFrame.DeltaQP="Delta QP en B-Frames"
+BFrame.DeltaQP.Description="Valor Delta QP para el ultimo I- o P-Frame para B-Frames no referenciables."
+BFrame.Reference="Referencia de B-Frames"
+BFrame.Reference.Description="Permitir a un B-Frame utilizar también B-Frames como referencia, en lugar de P - y I-Frames."
+BFrame.ReferenceDeltaQP="Delta QP en B-Frames de referencia"
+BFrame.ReferenceDeltaQP.Description="Valor Delta QP para el ultimo I- o P-Frame para B-Frames referenciables."
+DeblockingFilter="Filtro de eliminación de bloques"
+DeblockingFilter.Description="Permite al decodificador aplicar un filtro de eliminación de bloques."
+MotionEstimation="Estimación de movimiento"
+MotionEstimation.Description="Estimación de movimiento permite al codificador reducir el flujo de datos necesario estimando de donde vienen los pixeles."
+MotionEstimation.Quarter="Cuarto de Pixel"
+MotionEstimation.Half="Mitad de Pixel"
+MotionEstimation.Full="Cuarto y mitad de Pixel"
+Video.API="API de vídeo"
+Video.API.Description="¿Qué API debe usar el backend?"
+Video.Adapter="Adaptador de video"
+Video.Adapter.Description="¿En qué adaptador deberíamos intentar codificar?"
+OpenCL="OpenCL"
+OpenCL.Description="¿Se debe usar OpenCL para la presentación de fotogramas? Técnicamente es más rápido, pero provoca problemas con controladores de Intel (debido a bibliotecas incompatibles de OpenCL)."
+View="Modo de visualización"
+View.Description="¿Qué propiedades deben mostrarse? \nUsando '\@View.Master\@' te descalificará de recibir soporte."
+View.Basic="Básico"
+View.Advanced="Avanzado"
+View.Expert="Experto"
+View.Master="Maestro"
+Debug="Depuración"
+Debug.Description="Activar mensajes de depuración adicionales. Requiere que ejecute Open Broadcaster Software Studio con la línea de comandos '--verbose --log_unfiltered' (sin las comillas)."
AMF.H264.MaximumLTRFrames="Fotogramas LTR máximos"
AMF.H264.MaximumLTRFrames.Description="Fotogramas de referencia a largo plazo (LTR) son una característica que permite al codificador marcar ciertos frames en una secuencia como referentes por un largo tiempo.\nLos fotogramas LTR no pueden ser usados con B-Pictures y el codificador deshabilitará B-Pictures si se usa."
AMF.H264.MaximumAccessUnitSize="Tamaño máximo de la unidad de acceso"
AMF.H264.HeaderInsertionSpacing.Description="Cuantos fotogramas deben haber entre cabeceras NAL. No se recomienda cambiar de 0 (automatico)."
AMF.H264.WaitForTask="Esperar para la tarea"
AMF.H264.WaitForTask.Description="Desconocido, Experimental"
-AMF.H264.PreAnalysisPass="Pase de pre-análisis"
-AMF.H264.PreAnalysisPass.Description="Desconocido, Experimental"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Desconocido, Experimental"
-AMF.H264.GOPSize="Tamaño del GOP"
-AMF.H264.GOPSize.Description="Desconocido, Experimental"
-AMF.H264.GOPAlignment="Alineación del GOP"
-AMF.H264.GOPAlignment.Description="Desconocido, Experimental"
-AMF.H264.MaximumReferenceFrames="Fotogramas de referencia máximos"
-AMF.H264.MaximumReferenceFrames.Description="Desconocido, Experimental"
AMF.H264.SlicesPerFrame="Porciones por fotograma"
AMF.H264.SlicesPerFrame.Description="Cuantas porciones I-Frame deben ser almacenados en cada fotograma?\nUn valor de 0 permite al codificador decidir al vuelo.\nLa codificación Intra-Refresh es usada para una reproducción y exploración mas fluida."
AMF.H264.SliceMode="Modo de porciones"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Desconocido, Experimental"
AMF.H264.IntraRefresh.MacroblocksPerSlot="Numero de macrobloques intra-refresh por Slot"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Cuantos macrobloques deben ser almacenados en cada slot?\nUn valor de 0 deshabilita esta función.\nLa codificación Intra-Refresh es usada para una reproducción y exploración mas fluida."
-AMF.H264.VideoAPI="API de vídeo"
-AMF.H264.VideoAPI.Description="Que API usar para la codificación."
-AMF.H264.VideoAdapter="Adaptador de video"
-AMF.H264.VideoAdapter.Description="Que adaptador usar para la codificación."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="¿Debe el Codificador usar OpenCL para presentar los fotogramas individuales?"
-AMF.H264.View="Modo de Vista"
-AMF.H264.View.Description="Que propiedades deben ser visibles. No recibirás soporte si usas el modo de vista 'Experto' o 'Maestro'."
-AMF.H264.View.Basic="Básico"
-AMF.H264.View.Advanced="Avanzado"
-AMF.H264.View.Expert="Experto"
-AMF.H264.View.Master="Maestro"
-AMF.H264.Debug="Depurar"
-AMF.H264.Debug.Description="Habilita el registro de información de depuración adicional, debe ser activado cuando necesites ayuda con este codificador."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/eu-ES.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/eu-ES.ini
Changed
-AMF.Util.Default="Lehenetsia"
-AMF.Util.Automatic="Automatikoa"
-AMF.Util.Manual="Eskuz"
-AMF.Util.Toggle.Disabled="Ezgaituta"
-AMF.Util.Toggle.Enabled="Gaituta"
-AMF.H264.Preset="Aurrezarrita"
-AMF.H264.Preset.ResetToDefaults="Berrezarri balio lehenetsiak"
-AMF.H264.Preset.Recording="Grabatzen"
-AMF.H264.Preset.HighQuality="Kalitate handia"
-AMF.H264.Preset.Indistinguishable="Sumaezina"
-AMF.H264.Preset.Lossless="Galerarik gabe"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Erabilpena"
-AMF.H264.Usage.Description="AMFrentzat zer erabilerak aukeratu behar dira:\n- 'Transcoding' oro har proposatzen da transmisiorako (gomendatua),\n- 'Ultra Low Latency' latentzia oso baxuko kodetzerako,\n- 'Low Latency' aurrekoaren antzekoa da latentzia apur bat handiagoarekin.\nStreaming bakarrik onartzen du 'Transcoding', gainerako balio guztiak erabili daitezke grabaziorako."
-AMF.H264.Usage.Transcoding="Kodeketa"
-AMF.H264.Usage.UltraLowLatency="Latentzia oso txikia"
-AMF.H264.Usage.LowLatency="Latentzia txikia"
-AMF.H264.QualityPreset="Aurrezarritako kalitatea"
-AMF.H264.QualityPreset.Speed="Abiadura"
-AMF.H264.QualityPreset.Balanced="Orekatua"
-AMF.H264.QualityPreset.Quality="Kalitatea"
-AMF.H264.Profile="Profila"
-AMF.H264.ProfileLevel="Profilaren maila"
-AMF.H264.ProfileLevel.Description="H.264 profilaren zein maila erabili behar da kodetzeko:\n- 'Automatic' automatikoki kalkulatzen du emandako abiadura eta fotogramen tamainarako egokiena,\n- '4.1'-ek onartzen du 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2'-k onartzen du 1920x1080 60FPS, 1280x720 120FPS\n- '5.0'-k onartzen du 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1'-k onartzen du 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2'-k onartzen du 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Emaria kontrolatzeko metodoa"
-AMF.H264.RateControlMethod.CQP="Konstantea QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Bit emari konstantea (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Bit emari aldakorra (gailur mugatuak) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Bit emari aldakorra (Latentzia mugatua) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Bit emari helburua"
-AMF.H264.Bitrate.Peak="Gehienezko gailurra"
-AMF.H264.QP.Minimum="Gutxieneko QP"
-AMF.H264.QP.Maximum="Gehienezko QP"
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.VBVBuffer="VBV bufferra"
-AMF.H264.VBVBuffer.Strictness="VBV bufferraren zorroztasuna"
-AMF.H264.VBVBuffer.Size="VBV bufferraren tamaina"
-AMF.H264.VBVBuffer.Fullness="VBV bufferraren betetasuna"
-AMF.H264.FillerData="Datu betegarria"
-AMF.H264.FrameSkipping="Fotogramen saltoa"
-AMF.H264.EnforceHRDCompatibility="Behartu HRD bateragarritasuna"
-AMF.H264.KeyframeInterval="Gako fotogramen tartea"
-AMF.H264.IDRPeriod="IDR periodoa"
-AMF.H264.DeblockingFilter="Desblokeoko iragazkia"
-AMF.H264.ScanType="Eskaneatze mota"
-AMF.H264.ScanType.Description="Erabili behar den eskaneatze metodoa, utzi beti '\@AMF.H264.ScanType.Progressive\@'."
-AMF.H264.ScanType.Progressive="Progresiboa"
-AMF.H264.ScanType.Interlaced="Gurutzelarkatua"
-AMF.H264.MotionEstimation="Mugimenduaren estimazioa"
-AMF.H264.MotionEstimation.None="Ezer ez"
-AMF.H264.MotionEstimation.Half="Pixel erdia"
-AMF.H264.MotionEstimation.Quarter="Pixel laurdena"
-AMF.H264.MotionEstimation.Both="Pixel erdia eta laurdena"
-AMF.H264.CodingType="Kodetze mota"
+Utility.Default="Lehenetsia"
+Utility.Automatic="Automatikoa"
+Utility.Manual="Eskuz"
+Utility.Switch.Disabled="Ezgaituta"
+Utility.Switch.Enabled="Gaituta"
+Preset="Aurrezarrita"
+Preset.ResetToDefaults="Berrezarri balio lehenetsiak"
+Preset.Recording="Grabatzen"
+Preset.HighQuality="Kalitate handia"
+Preset.Indistinguishable="Sumaezina"
+Preset.Lossless="Galerarik gabe"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Erabilpena"
+Usage.Transcoding="Transkodetzen"
+Usage.UltraLowLatency="Latentzia ultra-txikia"
+Usage.LowLatency="Latentzia txikia"
+Usage.Webcam="Web-kamera"
+QualityPreset="Aurrezarritako kalitatea"
+QualityPreset.Speed="Abiadura"
+QualityPreset.Balanced="Orekatua"
+QualityPreset.Quality="Kalitatea"
+Profile="Profila"
+Profile.Description="Zein profila erabili nahi duzu kodetzeko, laguntza zabalduenetik kalitate handienera ordenatuta."
+ProfileLevel="Profil maila"
+ProfileLevel.Description="Zein profil maila erabili nahi duzu kodetzeko, gomendatzen da uztea \@Utility.Automatic\@"
AMF.H264.MaximumLTRFrames="Gehienezko LTR fotogramak"
AMF.H264.MaximumAccessUnitSize="Sarbide unitatearen gehienezko tamaina"
AMF.H264.HeaderInsertionSpacing="Goiburuak txertatzeko tartea"
AMF.H264.WaitForTask.Description="Ezezaguna, esperimentala"
-AMF.H264.PreAnalysisPass.Description="Ezezaguna, esperimentala"
-AMF.H264.VBAQ.Description="Ezezaguna, esperimentala"
-AMF.H264.GOPSize.Description="Ezezaguna, esperimentala"
-AMF.H264.GOPAlignment.Description="Ezezaguna, esperimentala"
-AMF.H264.MaximumReferenceFrames.Description="Ezezaguna, esperimentala"
AMF.H264.SlicesPerFrame="Zatiak fotogramako"
AMF.H264.SliceMode.Description="Ezezaguna, esperimentala"
AMF.H264.MaximumSliceSize.Description="Ezezaguna, esperimentala"
AMF.H264.SliceControlMode.Description="Ezezaguna, esperimentala"
AMF.H264.SliceControlSize.Description="Ezezaguna, esperimentala"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Ezezaguna, esperimentala"
-AMF.H264.VideoAPI="Bideo APIa"
-AMF.H264.VideoAPI.Description="Zein API erabili kodeketarako."
-AMF.H264.VideoAdapter="Bideo egokigailua"
-AMF.H264.VideoAdapter.Description="Zein egokigailu erabili kodeketarako."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Kodetzaileak OpenCL erabili behar du banakako fotogramak bidaltzeko?"
-AMF.H264.View="Ikuspegia"
-AMF.H264.View.Description="Zein propietate ikusi behar dira. Ez duzu laguntzarik jasoko 'Aditu' edo 'Maisu' ikuspegian."
-AMF.H264.View.Basic="Oinarrizkoa"
-AMF.H264.View.Advanced="Aurreratua"
-AMF.H264.View.Expert="Aditu"
-AMF.H264.View.Master="Maixu"
-AMF.H264.Debug="Garbiketa"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/fi-FI.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/fi-FI.ini
Changed
-AMF.Util.Default="Oletusarvo"
-AMF.Util.Automatic="Automaattinen"
-AMF.Util.Manual="Manuaalinen"
-AMF.Util.Toggle.Disabled="Pois käytöstä"
-AMF.Util.Toggle.Enabled="Käytössä"
-AMF.H264.Preset="Esiasetus"
-AMF.H264.Preset.ResetToDefaults="Palauta oletukset"
-AMF.H264.Preset.Recording="Tallennus"
-AMF.H264.Preset.HighQuality="Korkea laatu"
-AMF.H264.Preset.Indistinguishable="Erottamaton"
-AMF.H264.Preset.Lossless="Häviötön"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Käyttö"
-AMF.H264.Usage.Transcoding="Transkoodaus"
-AMF.H264.Usage.UltraLowLatency="Erittäin alhainen latenssi"
-AMF.H264.Usage.LowLatency="Alhainen latenssi"
-AMF.H264.QualityPreset="Laatuasetus"
-AMF.H264.QualityPreset.Speed="Nopeus"
-AMF.H264.QualityPreset.Balanced="Tasapainotettu"
-AMF.H264.QualityPreset.Quality="Laatu"
-AMF.H264.Profile="Profiili"
-AMF.H264.ProfileLevel="Profiilin taso"
-AMF.H264.RateControlMethod="Rate Control -tapa"
-AMF.H264.RateControlMethod.CQP="Pysyvä QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Jatkuva bitrate (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Vaihteleva bitrate (Piikkiin sidottu) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Vaihteleva bitrate (Latenssiin sidottu) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Bitrate-tavoite"
-AMF.H264.Bitrate.Peak="Bitrate-piikki"
-AMF.H264.QP.Minimum="Minimi QP"
-AMF.H264.QP.Maximum="Maksimi QP"
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.VBVBuffer="VBV-puskuri"
-AMF.H264.VBVBuffer.Size="VBV-puskurin koko"
-AMF.H264.FillerData="Täyttödata"
-AMF.H264.FrameSkipping="Ruudun ohitus"
-AMF.H264.EnforceHRDCompatibility="Pakota HRD-yhteensopivuus"
-AMF.H264.DeblockingFilter="Deblocking filtteri"
-AMF.H264.ScanType="Skannaustyyppi"
-AMF.H264.ScanType.Progressive="Progressiivinen"
-AMF.H264.ScanType.Interlaced="Lomitettu"
-AMF.H264.MotionEstimation="Liikkeen ennakointi"
-AMF.H264.MotionEstimation.None="Ei mitään"
-AMF.H264.View="Näyttötila"
-AMF.H264.View.Basic="Yksinkertainen"
-AMF.H264.View.Advanced="Kehittynyt"
-AMF.H264.View.Expert="Expertti"
-AMF.H264.View.Master="Jumalallinen"
-AMF.H264.Debug="Debug"
+Utility.Default="Oletusarvo"
+Utility.Automatic="Automaattinen"
+Utility.Manual="Manuaalinen"
+Utility.Switch.Disabled="Pois käytöstä"
+Utility.Switch.Enabled="Käytössä"
+Preset="Esiasetus"
+Preset.ResetToDefaults="Palauta oletukset"
+Preset.Recording="Tallennus"
+Preset.HighQuality="Korkea laatu"
+Preset.Indistinguishable="Erottamaton"
+Preset.Lossless="Häviötön"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Käyttö"
+Usage.Description="Mihin käyttöön AMF säädetään:\n- \"\@Usage.Transcoding\@\" on yleiseen käyttöön soveltuva transkooderi (suositeltu),\n- \"\@Usage.UltraLowLatency\@\" on todella matalan viiveen enkoodaukseen,\n- \"\@Usage.LowLatency\@\" on vastaavanlainen ylemmän kanssa, mutta hieman suuremmalla viiveellä.\nSuora lähettäminen tukee pelkästään \"\@Usage.Transcoding\@\", muita arvoja voidaan käyttää tallentamiseen."
+Usage.Transcoding="Transkoodaus"
+Usage.UltraLowLatency="Erittäin alhainen latenssi"
+Usage.LowLatency="Alhainen latenssi"
+Usage.Webcam="Web-kamera"
+QualityPreset="Laatuasetus"
+QualityPreset.Description="Minkä laadun esiasetusta AMF asetetaan tavoittelemaan:\n- \"\@QualityPreset.Speed\@\" on nopein, mutta huonoimmalla laadulla,\n- \"\@QualityPreset.Balanced\@\" on tasapainotettu sekoitus molempia,\n- \"\@QualityPreset.Quality\@\" antaa parhaan laadun annetulle bitratelle."
+QualityPreset.Speed="Nopeus"
+QualityPreset.Balanced="Tasapainotettu"
+QualityPreset.Quality="Laatu"
+Profile="Profiili"
+Profile.Description="Mitä profiilia käytetään enkoodaukseen. Lajiteltuna parhaiten tuetusta (yläosa) parhaaseen laatuun (alaosa)."
+ProfileLevel="Profiilin taso"
+ProfileLevel.Description="Mitä profiilin tasoa käytetään. Tämä on parasta jättää \@Utility.Automatic\@."
+Tier="Tier -taso"
+Tier.Description="Millä tier tasolla enkoodataan. \"High\" pyrkii korkean bitraten/siirtonopeuden käyttöön ja \"Main\" sen sijaan on suunnattu yleiseen mediaan."
+AspectRatio="Kuvasuhde"
+AspectRatio.Description="Mikä kuvasuhde kirjataan kohdetiedostoon."
+CodingType="Koodaustyyppi"
+CodingType.Description="Millaista koodausta käytetään:\n* \"\@Utility.Automatic\@\" antaa AMF:n päättää (suositeltu).\n* \"CAVLC\" (Context-Adaptive Variable-Length Coding) on nopeampi, mutta vie enemmän tilaa.\n* \"CABAC\" (Context-Adaptive Binary Arithmetic Coding) on hitaampi, mutta vie vähemmän tilaa."
+MaximumReferenceFrames="Referenssi framejen enimmäismäärä"
+MaximumReferenceFrames.Description="Montako framea enkooderi voi referoida enintään enkoodatessa, asetuksella on suora vaikutus enkoodauksen laatuun."
+RateControlMethod="Rate Control -tapa"
+RateControlMethod.Description="Mitä rate control tapaa käytetään:\n- \"\@RateControlMethod.CQP\@\" käyttää kiinteitä I-/P-/B-Frame QP arvoja,\n- \"\@RateControlMethod.CBR\@\" pysyttelee annetussa kohde-bitratessa (käyttäen täytedataa) (suositeltu suoraan lähetykseen),\n- \"\@RateControlMethod.VBR\@\" pysyttelee annetun korkeimman bitraten alla,\n- \"\@RateControlMethod.VBRLAT\@\" pysyttelee lähellä kohde bitratea, mikäli GPU viive ja kuormitus antaa myöten, muussa tapauksessa käyttää korkeampaa bitratea (suositeltu tallentamiseen)."
+RateControlMethod.CQP="Jatkuva QP (CQP)"
+RateControlMethod.CBR="Jatkuva bitrate (CBR)"
+RateControlMethod.VBR="Vaihteleva bitrate (Piikkiin sidottu) (VBR)"
+RateControlMethod.VBRLAT="Vaihteleva bitrate (Latenssiin sidottu) (VBRLAT)"
+PrePassMode="Pre-Pass -tila"
+PrePassMode.Description="Pre-Pass on toissijainen bitraten jaottelun läpikäynti, joka mahdollistaa paremman bitraten jaottelun ketjussa, vaikutus voi kuitenkin vaihdella eri korteilla."
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (neljännesosa)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (puolikas)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (kokonainen)"
+Bitrate.Target="Bitrate-tavoite"
+Bitrate.Target.Description="Bitrate johon tähdätään kokonaisketjussa."
+Bitrate.Peak="Bitrate-piikki"
+Bitrate.Peak.Description="Korkein bitrate johon tähdätään kokonaisketjussa."
+QP.IFrame="I-Frame QP"
+QP.IFrame.Description="Kiinteä QP-arvo I-frameille."
+QP.PFrame="P-Frame QP"
+QP.PFrame.Description="Kiinteä QP-arvo P-frameille."
+QP.BFrame="B-Frame QP"
+QP.BFrame.Description="Kiinteä QP-arvo B-frameille."
+QP.Minimum="Matalin QP"
+QP.Minimum.Description="Matalin QP-arvo frameille."
+QP.IFrame.Minimum="Matalin I-Frame QP"
+QP.IFrame.Minimum.Description="Matalin QP-arvo I-frameille."
+QP.PFrame.Minimum="Matalin P-Frame QP"
+QP.PFrame.Minimum.Description="Matalin QP-arvo P-frameille."
+QP.Maximum="Korkein QP"
+QP.Maximum.Description="Korkein QP-arvo frameille."
+QP.IFrame.Maximum="Korkein I-Frame QP"
+QP.IFrame.Maximum.Description="Korkein QP-arvo I-frameille."
+QP.PFrame.Maximum="Korkein P-Frame QP"
+QP.PFrame.Maximum.Description="Korkein QP-arvo P-frameille."
+FillerData="Täytedata"
+FillerData.Description="Täytedatan käyttöönotolla enkooderi pitää vähintään \@Bitrate.Target\@ täyttämällä jäljelle jääneen tilan ketjussa tyhjällä informaatiolla."
+FrameSkipping="Frame-ohitus"
+FrameSkipping.Description="Ruutujen ohituksella enkooderi voi pudottaa frameja täyttääkseen \@Bitrate.Target\@ vaatimukset.\nPudottaessaan framen enkooderi syöttää sen sijaan toistetun viimeisen NAL -framen striimiin.\nVoi auttaa jos käytössä on erittäin matala \@Bitrate.Target\@ vaatimus."
+VBAQ="VBAQ"
+VBAQ.Description="Ota käyttöön \"Variance Based Adaptive Quantization\" (VBAQ) joka pohjautuu pikselien vaihteluun bitraten jakamisen helpottamiseksi.\nSe toimii idealla jossa ihmissilmä on herkempi kuvan virheille tasaisissa kohdissa, jolloin bitrate kohdistetaan näille alueille.\nKäyttämällä tätä laatu saattaa parantua tietynlaista sisältöä näytettäessä."
+EnforceHRD="Pakota HRD"
+EnforceHRD.Description="Pakota hypoteettisen referenssi dekooderin käyttö, jota käytetään varmistamaan, että ulostulon bittistriimi on oikea."
+VBVBuffer="VBV-puskuri"
+VBVBuffer.Description="Mitä tapaa käytetään määrittämään VBV puskurin koko:\n- \"\@Utility.Automatic\@\" laskee koon käyttäen täsmällistä rajausta,\n- \"\@Utility.Manual\@\" antaa käyttäjän ohjata kokoa.\nVBV (Video Buffering Verifier) puskuria käytetään joissakin Rate Control tavoissa pitämään kokonaisbitrate annettujen rajojen sisällä."
+VBVBuffer.Strictness="VBV-puskurin tiukkuus"
+VBVBuffer.Strictness.Description="Määrittelee VBV puskurin tiukkuuden, 100% on niin tarkka kuin mahdollista ja 0% on rajoittamaton."
+VBVBuffer.Size="VBV-puskurin koko"
+VBVBuffer.Size.Description="VBV-puskurin koko jota käytetään ketjussa bitraten säädössä."
+VBVBuffer.InitialFullness="VBV-puskurin alustava täysinäisyys"
+VBVBuffer.InitialFullness.Description="VBV-puskurin alustavan täysinäisyyden suuruus (%), vaikuttaa vain alustavaan enkoodausketjuun."
+KeyframeInterval="Keyframe -väli"
+KeyframeInterval.Description="Aikaväli (sekunteina) keyframejen välissä."
+H264.IDRPeriod="IDR-ajanjakso (frameina)"
+H264.IDRPeriod.Description="Määrittää etäisyyden Instantaneous Decoding Refreshes (IDR) välillä frameissa. Säätää myös GOP-ketjun kokoa."
+H265.IDRPeriod="IDR-ajanjakso (GOP)"
+H265.IDRPeriod.Description="Määrittää etäisyyden Instantaneous Decoding Refreshes (IDR) välillä GOP:seissa."
+GOP.Type="GOP-tyyppi"
+GOP.Type.Description="Millaista GOP-tyyppiä käytetään:\n- \"\@GOP.Type.Fixed\@\" käyttää aina kiinteää väliä jokaisen GOP:n välillä.\n- \"\@GOP.Type.Variable\@\" sallii eri kokoisia GOP-arvoja, tarpeesta riippuen.\n\"\@GOP.Type.Fixed\@\" on kuinka H264 toteutus toimii ja on paras paikallisen verkon suoralähetyksessä, kunnes taas \"\@GOP.Type.Variable\@\" on paras pienikokoiseen korkealaatuiseen tallentamiseen."
+GOP.Type.Fixed="Kiinteä"
+GOP.Type.Variable="Muuttuva"
+GOP.Size="GOP-koko"
+GOP.Size.Description="GOP:n (Group Of Pictures) koko frameissa."
+GOP.Size.Minimum="Matalin GOP koko"
+GOP.Size.Minimum.Description="Pienin GOP (Group of Pictures) koko frameissa."
+GOP.Size.Maximum="Suurin GOP koko"
+GOP.Size.Maximum.Description="Suurin GOP (Group of Pictures) koko frameissa."
+GOP.Alignment="GOP kohdistus"
+GOP.Alignment.Description="Kokeellinen, vaikutuksia ei ole tiedossa. Käytä omalla vastuullasi."
+BFrame.Pattern="B-Frame kuvio"
+BFrame.Pattern.Description="Käytettävien B-Framejen määrä enkoodauksessa.\nTuettu 2. ja 3. sukupolven VCE korteissa. Negatiivinen vaikutus enkoodauksen suorituskykyyn."
+BFrame.DeltaQP="B-Frame Delta QP"
+BFrame.DeltaQP.Description="Delta QP arvo viimeiseen I- tai P-Frameen ei-referoitavissa oleville B-Frameille."
+BFrame.Reference="B-Frame referenssi"
+BFrame.Reference.Description="Salli B-Framejen käyttää myös toisia B-Frameja referenssinä pelkkien P- ja I-Framejen sijasta."
+BFrame.ReferenceDeltaQP="B-Frame referenssi Delta QP"
+BFrame.ReferenceDeltaQP.Description="Delta QP arvo viimeiseen I- tai P-Frameen referoitavissa oleville B-Frameille."
+DeblockingFilter="Deblocking suodatin"
+DeblockingFilter.Description="Salli dekooderin käyttää Deblocking suodatinta."
+MotionEstimation="Liikkeen arviointi"
+MotionEstimation.Description="Liikkeen arviointi mahdollistaa enkooderin vähentää tarvittavaa bitratea arvioimalla pikselin kulkusuunnan."
+MotionEstimation.Quarter="Neljännes-pikseli"
+MotionEstimation.Half="Puoli-pikseli"
+MotionEstimation.Full="Neljännes- & Puolikas-pikseli"
+Video.API="Video API"
+Video.API.Description="Mitä APIa käytetään backendillä?"
+Video.Adapter="Näytönohjain"
+Video.Adapter.Description="Millä näytönohjaimella yritetään enkoodata?"
+OpenCL="OpenCL"
+OpenCL.Description="Käytetäänkö OpenCL:ää framejen toimittamiseen? Teknisesti nopeampi, mutta aiheuttaa ongelmia Intelin ajurien kanssa (Epäyhteensopivien OpenCL kirjastojen kanssa)."
+View="Katselutila"
+View.Description="Mitä ominaisuuksia näytetään?\nKäyttämällä \"\@View.Master\@\" menetät oikeutesi saada tukea."
+View.Basic="Yksinkertainen"
+View.Advanced="Kehittynyt"
+View.Expert="Expertti"
+View.Master="Mestarillinen"
+Debug="Debug"
+Debug.Description="Ota käyttöön lisää debuggaus viestejä. Vaatii Open Broadcaster Software Studion käynnistämisen komentorivillä \"--verbose --log_unfiltered\" (ilman heittomerkkejä)."
+AMF.H264.MaximumLTRFrames="Maksimimäärä LTR frameja"
+AMF.H264.MaximumLTRFrames.Description="Pitkäaikaiset referenssi framet (LTR) on ominaisuus, jolla enkooderi voi merkitä tiettyjä frameja ketjussa referoitaviksi pitemmän aikaa.\nLTR frameja ei voida käyttää B-Framejen kanssa ja enkooderi ottaa B-Framet pois käytöstä mikäli ne ovat käytössä."
+AMF.H264.MaximumAccessUnitSize="Maksimi Access Unit koko"
+AMF.H264.MaximumAccessUnitSize.Description="Suurin Access Unit koko NAL:lle."
+AMF.H264.HeaderInsertionSpacing="Header Insertion Spacing"
+AMF.H264.HeaderInsertionSpacing.Description="How many frames should be between NAL headers."
+AMF.H264.WaitForTask="Wait For Task"
+AMF.H264.WaitForTask.Description="Unknown, Experimental"
+AMF.H264.SlicesPerFrame="Slices Per Frame"
+AMF.H264.SlicesPerFrame.Description="How many I-Frame slices should be stored with each frame?\nA value of zero lets the encoder decide on the fly.\nIntra-Refresh encoding is used for faster playback and seeking."
+AMF.H264.SliceMode="Slice Mode"
+AMF.H264.SliceMode.Description="Unknown, Experimental"
+AMF.H264.MaximumSliceSize="Maximum Slice Size"
+AMF.H264.MaximumSliceSize.Description="Unknown, Experimental"
+AMF.H264.SliceControlMode="Slice Control Mode"
+AMF.H264.SliceControlMode.Description="Unknown, Experimental"
+AMF.H264.SliceControlSize="Slice Control Size"
+AMF.H264.SliceControlSize.Description="Unknown, Experimental"
+AMF.H264.IntraRefresh.NumberOfStripes="Intra-Refresh Number of Stripes"
+AMF.H264.IntraRefresh.NumberOfStripes.Description="Unknown, Experimental"
+AMF.H264.IntraRefresh.MacroblocksPerSlot="Intra-Refresh Macroblocks per Slot"
+AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="How many Macroblocks should be stored in each slot?\nA value of 0 disables this feature.\nIntra-Refresh encoding is used for faster playback and seeking."
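Many of the description strings added above embed cross-references of the form `\@Other.Key\@`, which are expanded with the value of another locale entry. As an illustrative sketch only (this is not the enc-amf plugin's actual loader; the function names are invented for the example), such flat `Key="Value"` locale files can be parsed and their references resolved like this:

```python
import re

def parse_locale(text: str) -> dict:
    """Parse flat Key="Value" lines (one entry per line) into a dict."""
    entries = {}
    for line in text.splitlines():
        m = re.match(r'\s*([^=\s][^=]*)="(.*)"\s*$', line)
        if m:
            entries[m.group(1)] = m.group(2)
    return entries

def resolve(entries: dict, key: str) -> str:
    """Expand backslash-@-delimited references to other keys in a value."""
    return re.sub(r'\\@(.+?)\\@',
                  lambda m: entries.get(m.group(1), m.group(0)),
                  entries[key])
```

For example, resolving `Usage.Description` from the fi-FI hunk would substitute the value of `Usage.Transcoding` wherever `\@Usage.Transcoding\@` appears. Unknown keys are left as-is rather than raising, since a locale file may reference keys defined only in the fallback (en-US) locale.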
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/fr-FR.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/fr-FR.ini
Changed
-AMF.Util.Default="Défaut"
-AMF.Util.Automatic="Automatique"
-AMF.Util.Manual="Manuel"
-AMF.Util.Toggle.Disabled="Désactivé"
-AMF.Util.Toggle.Enabled="Activé"
-AMF.H264.Preset="Préréglage"
-AMF.H264.Preset.ResetToDefaults="Valeurs par défaut"
-AMF.H264.Preset.Recording="Enregistrement"
-AMF.H264.Preset.HighQuality="Qualité élevée"
-AMF.H264.Preset.Indistinguishable="Indifférenciable"
-AMF.H264.Preset.Lossless="Sans pertes"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Mode"
-AMF.H264.Usage.Description="Le mode d'optimisation utilisé par AMF :\n- '\@AMF.H264.Usage.Transcoding\@' : utilisation pour du transcodage (réglage recommandé),\n- '\@AMF.H264.Usage.UltraLowLatency\@' : encodage à très faible latence,\n- '\@AMF.H264.Usage.LowLatency\@' : quasiment identique au réglage ci-dessus, avec une latence légèrement plus élevée.\nLe streaming n'est possible qu'en mode '\@AMF.H264.Usage.Transcoding\@', l'enregistrement est possible avec tous les autres modes."
-AMF.H264.Usage.Transcoding="Transcodage"
-AMF.H264.Usage.UltraLowLatency="Très faible latence"
-AMF.H264.Usage.LowLatency="Faible latence"
-AMF.H264.QualityPreset="Préréglages de qualité"
-AMF.H264.QualityPreset.Description="Le préréglage de qualité qu'AMF doit cibler :\n- \"\@AMF.H264.QualityPreset.Speed\@\" est le plus rapide, au détriment d'une qualité déplorable,\n- \"\@AMF.H264.QualityPreset.Balanced\@\" est un compromis entre \"\@AMF.H264.QualityPreset.Speed\@\" et \"\@AMF.H264.QualityPreset.Quality\@\",\n- \"\@AMF.H264.QualityPreset.Quality\@\" délivre la meilleure qualité pour un débit donné."
-AMF.H264.QualityPreset.Speed="Vitesse"
-AMF.H264.QualityPreset.Balanced="Equilibré"
-AMF.H264.QualityPreset.Quality="Qualité"
-AMF.H264.Profile="Profil"
-AMF.H264.Profile.Description="Le profil H.264 utilisé pour l'encodage :\n- 'Baseline' est compatible avec la majorité des lecteurs,\n-'Main' est compatible avec les lecteurs d'ancienne génération (recommandé pour les lecteurs mobiles),\n- 'High' est compatible avec la plupart des lecteurs récents (réglage recommandé pour la plupart des cas)."
-AMF.H264.ProfileLevel="Niveau de profil"
-AMF.H264.ProfileLevel.Description="Quel niveau de profil H.264 utiliser pour l'encodage :\n- \"Automatique\" laisse l'encodeur déterminer le meilleur profil en fonction de la résolution d'image et du nombre d'images par seconde.\n- \"4.1\" supporte les formats suivants : 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- \"4.2\" supporte les formats suivants : 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- \"5.0\" supporte les formats suivants : 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- \"5.1\" supporte les formats suivants : 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- \"5.2\" supporte les formats suivants : 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Méthode de contrôle du débit"
-AMF.H264.RateControlMethod.CQP="QP constant (CQP)"
-AMF.H264.RateControlMethod.CBR="Débit constant (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Débit Variable (maximum) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Débit Variable (latence limitée) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Débit cible"
-AMF.H264.Bitrate.Target.Description="Le débit de données sortantes que l'encodeur va essayer de respecter pendant l'encodage."
-AMF.H264.Bitrate.Peak="Débit maximal"
-AMF.H264.Bitrate.Peak.Description="Le débit de données sortantes que l'encodeur va essayer de respecter pendant l'encodage."
-AMF.H264.QP.Minimum="QP minimal"
-AMF.H264.QP.Maximum="QP maximal"
-AMF.H264.QP.IFrame="QP I-Frame"
-AMF.H264.QP.PFrame="QP P-Frame"
-AMF.H264.QP.BFrame="QP B-Frame"
-AMF.H264.VBVBuffer="Tampon VBV"
-AMF.H264.VBVBuffer.Description="Quelle méthode utiliser pour déterminer la taille du tampon VBV :\n- \"\@AMF.Util.Automatic\@\" calcule la taille en fonction de la valeur de respect du tampon,\n- \"\@AMF.Util.Manual\@\" laisse le choix de la taille à l'utilisateur.\nLe tampon VBV (Video Buffering Verifier) est utilisé par certaines méthodes de contrôle du débit pour assurer au mieux le respect des contraintes données."
-AMF.H264.VBVBuffer.Strictness="Respect du tampon VBV"
-AMF.H264.VBVBuffer.Size="Taille du tampon VBV"
-AMF.H264.VBVBuffer.Fullness="Remplissage du tampon VBV"
-AMF.H264.FillerData="Données de remplissage"
-AMF.H264.FrameSkipping="Saut d'images"
-AMF.H264.EnforceHRDCompatibility="Appliquer la compatibilité avec l'HRD"
-AMF.H264.KeyframeInterval="Intervalle d'images-clé"
-AMF.H264.IDRPeriod="Périodicité des trames IDR"
-AMF.H264.DeblockingFilter="Filtre de dégroupage"
-AMF.H264.ScanType="Balayage"
-AMF.H264.ScanType.Description="La méthode de balayage à utiliser (laissez cette valeur sur \"\@AMF.H264.ScanType.Progressive\@\")."
-AMF.H264.ScanType.Progressive="Progressif"
-AMF.H264.ScanType.Interlaced="Entrelacé"
-AMF.H264.MotionEstimation="Estimation de mouvement"
-AMF.H264.MotionEstimation.Description="L'estimation du mouvement permet à l'encodeur de réduire le débit en calculant le déplacement des pixels."
-AMF.H264.MotionEstimation.None="Aucun"
-AMF.H264.MotionEstimation.Half="Demi-pixel"
-AMF.H264.MotionEstimation.Quarter="Quart de pixel"
-AMF.H264.MotionEstimation.Both="Demi-pixel & quart de pixel"
-AMF.H264.CodingType="Type de codage"
-AMF.H264.CodingType.Description="Le type de codage à utiliser:\n* \"\@AMF.Util.Default\@\" : laisser AMF décider (recommandé).\n* CALVC (Context-Adaptive Variable-Length Coding) est rapide mais lourd.\n* CABAC (Context-Adaptive Binary Arithmetic Coding) est lent mais léger."
+Utility.Default="Défaut"
+Utility.Automatic="Automatique"
+Utility.Manual="Manuel"
+Utility.Switch.Disabled="Désactivé"
+Utility.Switch.Enabled="Activé"
+Preset="Pré-réglage"
+Preset.ResetToDefaults="Valeurs par défaut"
+Preset.Recording="Enregistrement"
+Preset.HighQuality="Qualité élevée"
+Preset.Indistinguishable="Indifférenciable"
+Preset.Lossless="Sans pertes"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Usage"
+Usage.Transcoding="Transcodage"
+Usage.UltraLowLatency="Très faible latence"
+Usage.LowLatency="Faible latence"
+Usage.Webcam="Webcam"
+QualityPreset="Préréglages de qualité"
+QualityPreset.Speed="Vitesse"
+QualityPreset.Balanced="Equilibré"
+QualityPreset.Quality="Qualité"
+Profile="Profil"
+ProfileLevel="Niveau de profil"
+Tier="Palier"
+AspectRatio="Ratio de l'image"
+MotionEstimation.Half="Demi-pixel"
+Video.API="API vidéo"
+Video.Adapter="Périphérique vidéo"
+View.Basic="Basique"
+View.Advanced="Avancé"
+View.Expert="Expert"
+View.Master="Master"
+Debug="Débogage"
AMF.H264.MaximumLTRFrames="Maximum de trames LTR"
AMF.H264.MaximumAccessUnitSize="Taille max. d'une Access Unit"
AMF.H264.MaximumAccessUnitSize.Description="Taille maximale d’une unité d’accès pour un NAL. Une valeur de 0 permet à l’encodeur de choisir le meilleur."
AMF.H264.HeaderInsertionSpacing="Intervalle d'insertion de l'en-tête de stream"
AMF.H264.SlicesPerFrame="Tranches par image"
-AMF.H264.VideoAPI="API vidéo"
-AMF.H264.VideoAdapter="Périphérique vidéo"
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.View="Mode de visualisation"
-AMF.H264.View.Description="Quels paramètres afficher ?\nChoisir '\@AMF.H264.View.Master\@' est réservé aux utilisateurs avancés, et vous exclus d'office de toute possibilité d'assistance de la part du développeur."
-AMF.H264.View.Basic="Basique"
-AMF.H264.View.Advanced="Avancé"
-AMF.H264.View.Expert="Expert"
-AMF.H264.View.Master="Maître"
-AMF.H264.Debug="Débogage"
-AMF.H264.Debug.Description="Activer le débogage avancé dans le fichier journal (dans le cas où vous souhaitez solliciter une assistance auprès du développeur de l'encodeur)."
+AMF.H264.SliceMode.Description="Inconnu, expérimental"
+AMF.H264.MaximumSliceSize.Description="Inconnu, expérimental"
+AMF.H264.SliceControlMode.Description="Inconnu, expérimental"
+AMF.H264.IntraRefresh.NumberOfStripes.Description="Inconnu, expérimental"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/hu-HU.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/hu-HU.ini
Changed
-AMF.Util.Default="Alapértelmezett"
-AMF.Util.Automatic="Automatikus"
-AMF.Util.Manual="Manuális"
-AMF.Util.Toggle.Disabled="Letiltva"
-AMF.Util.Toggle.Enabled="Engedélyezve"
-AMF.H264.Preset="Készlet"
-AMF.H264.Preset.ResetToDefaults="Alapértelmezett beállítások visszaállítása"
-AMF.H264.Preset.Recording="Felvétel"
-AMF.H264.Preset.HighQuality="Kiváló minőség"
-AMF.H264.Preset.Indistinguishable="Megkülönböztethetetlen"
-AMF.H264.Preset.Lossless="Veszteségmentes"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Használat"
-AMF.H264.Usage.Transcoding="Transzkódolás"
-AMF.H264.Usage.UltraLowLatency="Nagyon alacsony késleltesésű"
-AMF.H264.Usage.LowLatency="Alacsony késleltetésű"
-AMF.H264.QualityPreset="Minőségi profil"
-AMF.H264.QualityPreset.Speed="Gyors"
-AMF.H264.QualityPreset.Balanced="Kiegyenlített"
-AMF.H264.QualityPreset.Quality="Minőségi"
-AMF.H264.Profile="Enkóder profil"
-AMF.H264.Profile.Description="Mely H.264 Profilt használja kódoláshoz:\n- 'Baseline' a legnagyobb platform támogatottsággal,\n- 'Main' a régebbi eszközök támogatják (ajánlott ha mobil eszköz tulajdonosokat céloz meg),\n- 'High' a jelenlegi eszközök támogatják (ajánlott)."
-AMF.H264.ProfileLevel="Profil szint"
-AMF.H264.RateControlMethod="Bitráta vezérlés"
-AMF.H264.RateControlMethod.CQP="Erőltetett QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Konstans bitsebesség (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Változó bitsebesség (Csúcsértéket betartva) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Változó bitsebesség (Késleltetés kényszerítése) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Célbitsebesség"
-AMF.H264.Bitrate.Target.Description="Bitráta amelyet szeretne elérni a konvertálás során."
-AMF.H264.Bitrate.Peak="Csúcs bitsebesség"
-AMF.H264.Bitrate.Peak.Description="Bitsebesség amelyet maximálisan elérhet a konvertálás során."
-AMF.H264.QP.Minimum="Minimum QP"
-AMF.H264.QP.Minimum.Description="Legalacsonyabb QP (Quantization Parameter) képkockában használható érték."
-AMF.H264.QP.Maximum="Maximum QP"
-AMF.H264.QP.Maximum.Description="Legmagasabb QP (Quantization Parameter) képkockában használható érték."
-AMF.H264.QP.IFrame="I-Képkocka QP"
-AMF.H264.QP.IFrame.Description="Rögzített QP érték az I képkockák használatához."
-AMF.H264.QP.PFrame="P-Képkocka QP"
-AMF.H264.QP.PFrame.Description="Rögzített QP érték a P képkockák használatához."
-AMF.H264.QP.BFrame="B-Képkocka QP"
-AMF.H264.QP.BFrame.Description="Rögzített QP (Quantization Parameter) érték a B-Képkockák használatához."
-AMF.H264.VBVBuffer="VBV puffer"
-AMF.H264.VBVBuffer.Strictness="VBV Puffer kötöttség"
-AMF.H264.VBVBuffer.Strictness.Description="Meghatározza a VBV puffer szigorúságát, 100% esetén teljesen pontos és 0% esetén kötetlen."
-AMF.H264.VBVBuffer.Size="VBV pufferméret"
-AMF.H264.VBVBuffer.Fullness="VBV puffer telítettség"
-AMF.H264.FillerData="Filler adat"
-AMF.H264.FrameSkipping="Képkocka kihagyás"
-AMF.H264.EnforceHRDCompatibility="HRD Kompatibilitás kényszerítése"
-AMF.H264.KeyframeInterval="Kulcsképkocka időköze"
-AMF.H264.IDRPeriod="IDR időköz"
-AMF.H264.BFrame.Pattern="B képkocka"
-AMF.H264.BFrame.Pattern.Description="A B képkockák száma kódolás közben.\nA második és harmadik generációjú VCE kártyák támogatják. Hátráltató hatása van a kódolás teljesítményére."
-AMF.H264.BFrame.DeltaQP="B képkocka Delta QP"
-AMF.H264.BFrame.DeltaQP.Description="Delta QP érték az utolsó I vagy P képkocka nem referenciálható B képkockáihoz."
-AMF.H264.BFrame.Reference="Referenciálható B képkockák"
-AMF.H264.BFrame.Reference.Description="Lehetővé teszi a B képkockák számára, hogy B képkockát referenciaként használjon, P és I képkockák helyett."
-AMF.H264.BFrame.ReferenceDeltaQP="Delta QP a referenciálható B kockákhoz"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="Delta QP érték az utolsó I vagy P képkockának a referenciálható B képkockákhoz."
-AMF.H264.DeblockingFilter="Deblocking Filter"
-AMF.H264.ScanType="Rögzítés módja"
-AMF.H264.ScanType.Progressive="Progresszív"
-AMF.H264.ScanType.Interlaced="Váltottsoros (Kísérleti)"
-AMF.H264.MotionEstimation="Mozgásbecslés"
-AMF.H264.MotionEstimation.Description="Mozdulat becslés lehetővé teszi a kódolónak, hogy csökkentse a bitsebesség követelményt a pixel elmozdulásának a felbecsülésével."
-AMF.H264.MotionEstimation.None="Semmi"
-AMF.H264.MotionEstimation.Half="Fél-pixel"
-AMF.H264.MotionEstimation.Quarter="Negyed-pixel"
-AMF.H264.MotionEstimation.Both="Fél-&negyed-pixel"
-AMF.H264.CodingType="Kódolás típusa"
+Utility.Default="Alapértelmezett"
+Utility.Automatic="Automatikus"
+Utility.Manual="Manuális"
+Utility.Switch.Disabled="Letiltva"
+Utility.Switch.Enabled="Engedélyezve"
+Preset="Sablon"
+Preset.ResetToDefaults="Alapértelmezett beállítások visszaállítása"
+Preset.Recording="Felvétel"
+Preset.HighQuality="Jó minőség"
+Preset.Indistinguishable="Megkülönböztethetetlen"
+Preset.Lossless="Veszteségmentes"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Használat"
+Usage.Transcoding="Átkódolás"
+Usage.UltraLowLatency="Nagyon alacsony késleltetés"
+Usage.LowLatency="Alacsony késleltetés"
+Usage.Webcam="Webkamera"
+QualityPreset="Minőségi profil"
+QualityPreset.Speed="Sebesség"
+QualityPreset.Balanced="Kiegyensúlyozott"
+QualityPreset.Quality="Minőségi"
+Profile="Profil"
+ProfileLevel="Profil szint"
+Tier="Szint"
+AspectRatio="Képarány"
+CodingType="Kódolás típusa"
+MaximumReferenceFrames="Maximális referencia képkockák"
+RateControlMethod="Sebességvezérlési metódus"
+RateControlMethod.CQP="Állandó QP (CQP)"
+RateControlMethod.CBR="Konstans bitsebesség (CBR)"
+RateControlMethod.VBR="Változó bitsebesség (Csúcsértéket betartva) (VBR)"
+RateControlMethod.VBRLAT="Változó bitsebesség (Késleltetés kényszerítése) (VBRLAT)"
+Bitrate.Target="Célbitsebesség"
+Bitrate.Target.Description="Bitsebesség, amelyet megkísérel elérni a szekvencia során."
+Bitrate.Peak="Csúcs bitsebesség"
+Bitrate.Peak.Description="Bitsebesség amelyet maximálisan elérhet a szekvencia során."
+QP.IFrame="I-Képkocka QP"
+QP.IFrame.Description="Rögzített QP érték az I képkockák használatához."
+QP.PFrame="P-Képkocka QP"
+QP.PFrame.Description="Rögzített QP érték a P képkockák használatához."
+QP.BFrame="B-Képkocka QP"
+QP.BFrame.Description="Rögzített QP érték az B képkockák használatához."
+QP.Minimum="Minimum QP"
+QP.Minimum.Description="Képkockában legalacsonyabb használható QP érték."
+QP.IFrame.Minimum="Minimális I-Képkocka QP"
+QP.IFrame.Minimum.Description="I-Képkockában legalacsonyabb használható QP érték."
+QP.PFrame.Minimum="Minimális P-Képkocka QP"
+QP.PFrame.Minimum.Description="P-Képkockában legalacsonyabb használható QP érték."
+QP.Maximum="Maximum QP"
+QP.Maximum.Description="Képkockában legnagyobb használható QP érték."
+QP.IFrame.Maximum="Maximális I-Képkocka QP"
+QP.IFrame.Maximum.Description="I-Képkockában legnagyobb használható QP érték."
+QP.PFrame.Maximum="Maximális P-Képkocka QP"
+QP.PFrame.Maximum.Description="P-Képkockában legnagyobb használható QP érték."
+FillerData="Kitöltőadat"
+FrameSkipping="Képkocka kihagyás"
+VBAQ="VBAQ"
+EnforceHRD="HRD kényszerítése"
+VBVBuffer="VBV puffer"
+VBVBuffer.Strictness="VBV Puffer kötöttség"
+VBVBuffer.Strictness.Description="Meghatározza a VBV puffer szigorúságát, 100% esetén teljesen pontos és 0% esetén kötetlen."
+VBVBuffer.Size="VBV pufferméret"
+VBVBuffer.Size.Description="A VBV puffer mérete, amely a bitsebesség vezérléshez használatos a szekvenciában."
+VBVBuffer.InitialFullness="VBV puffer kezdeti telítettség"
+KeyframeInterval="Kulcsképkocka időköze"
+KeyframeInterval.Description="A kulcsképkockák közötti időköz (másodpercben)."
+H264.IDRPeriod="IDR idő (Képkockákban)"
+H264.IDRPeriod.Description="Meghatározza a távolságot két pillanatnyi dekódoló frissítő (IDR) között a képkockákban. A GOP szekvencia méretét is szabályozza."
+H265.IDRPeriod="IDR idő (GOP-okban)"
+GOP.Type="GOP típusa"
+GOP.Type.Fixed="Rögzített"
+GOP.Type.Variable="Változó"
+GOP.Size="GOP méret"
+GOP.Size.Minimum="GOP méretminimum"
+GOP.Size.Minimum.Description="Minimális GOP (képek csoportja) méret keretekben."
+GOP.Size.Maximum="GOP méretmaximum"
+GOP.Size.Maximum.Description="Maximális GOP (képek csoportja) méret keretekben."
+GOP.Alignment="GOP igazítás"
+GOP.Alignment.Description="Kísérleti, hatása ismeretlen. Csak saját felelősségére használja."
+BFrame.Pattern="B-képkockaminta"
+BFrame.Pattern.Description="A B képkockák száma kódolás közben.\nA második és harmadik generációjú VCE kártyák támogatják. Hátráltató hatása van a kódolás teljesítményére."
+BFrame.DeltaQP="B-képkocka Delta QP"
+BFrame.DeltaQP.Description="Delta QP érték az utolsó I vagy P képkocka nem referenciálható B képkockáihoz."
+BFrame.Reference="Referencia B-Képkocka"
+BFrame.Reference.Description="Lehetővé teszi a B képkockák számára, hogy B képkockát referenciaként használjon, P és I képkockák helyett."
+BFrame.ReferenceDeltaQP="B-képkocka referencia Delta QP"
+BFrame.ReferenceDeltaQP.Description="Delta QP érték az utolsó I vagy P képkockának a referenciálható B képkockákhoz."
+DeblockingFilter="Deblocking szűrő"
+DeblockingFilter.Description="Lehetővé teszi a dekódernek, hogy Deblocking szűrőt alkalmazzon."
+MotionEstimation="Mozgásbecslés"
+MotionEstimation.Description="Mozdulat becslés lehetővé teszi a kódolónak, hogy csökkentse a bitsebesség követelményt a pixel elmozdulásának a felbecsülésével."
+MotionEstimation.Quarter="Negyedpixel"
+MotionEstimation.Half="Félpixel"
+MotionEstimation.Full="Negyed és félpixel"
+Video.API="Videó API"
+Video.Adapter="Videó adapter"
+Video.Adapter.Description="Milyen Adapteren történjen a kódolás?"
+OpenCL="OpenCL"
+View="Nézet mód"
+View.Basic="Alap"
+View.Advanced="Haladó"
+View.Expert="Szakértő"
+View.Master="Mester"
+Debug="Hibakeresés"
AMF.H264.MaximumLTRFrames="Maximális LTR képkocka"
AMF.H264.MaximumAccessUnitSize="Hozzáférési egység maximális mérete"
AMF.H264.MaximumAccessUnitSize.Description="NAL számára a legnagyobb elérési egység."
+AMF.H264.HeaderInsertionSpacing="Fejléc beszúrási térköz"
AMF.H264.HeaderInsertionSpacing.Description="NAL fejlécek közötti képkockák száma."
AMF.H264.WaitForTask="Feladatra várakozás"
AMF.H264.WaitForTask.Description="Ismeretlen, kísérleti"
-AMF.H264.PreAnalysisPass="Elemzés előtti fázis"
-AMF.H264.PreAnalysisPass.Description="Ismeretlen, kísérleti"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Ismeretlen, kísérleti"
-AMF.H264.GOPSize="GOP méret"
-AMF.H264.GOPSize.Description="Ismeretlen, kísérleti"
-AMF.H264.GOPAlignment="GOP igazítás"
-AMF.H264.GOPAlignment.Description="Ismeretlen, kísérleti"
-AMF.H264.MaximumReferenceFrames="Maximális referencia képkockák"
-AMF.H264.MaximumReferenceFrames.Description="Ismeretlen, kísérleti"
AMF.H264.SlicesPerFrame="Szeletelés képkockánként"
+AMF.H264.SlicesPerFrame.Description="Mennyi I-Képkocka szelet tárolható egyes képkockán?\nA zéró érték hagyja a kódolót menet közben dönteni.\nAz Intra-frissítési kódolás a gyorsabb lejátszáshoz és kereséshez használható."
AMF.H264.SliceMode="Szelet mód"
AMF.H264.SliceMode.Description="Ismeretlen, kísérleti"
AMF.H264.MaximumSliceSize="Maximális szeletméret"
AMF.H264.SliceControlSize.Description="Ismeretlen, kísérleti"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Ismeretlen, kísérleti"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Hány makroblokkot tároljon céllánként?\nA 0 érték kikapcsolja a funkciót.\nIntra-Refresh kódolás a gyorsabb lejátszáshoz és kereséshez használható."
-AMF.H264.VideoAPI="Videó API"
-AMF.H264.VideoAPI.Description="Melyik API használható kódoláshoz."
-AMF.H264.VideoAdapter="Videó adapter"
-AMF.H264.VideoAdapter.Description="Melyik adapter használható kódoláshoz."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="A kódolós használhat OpenCL-t az egyes képkockák elkészítéséhez?"
-AMF.H264.View="Nézet mód"
-AMF.H264.View.Description="Milyen tulajdonságok jelenjenek meg?\nA '\@AMF.H264.View.Master\@' használata kizárja a támogatásban való részesülésből."
-AMF.H264.View.Basic="Alap"
-AMF.H264.View.Advanced="Haladó"
-AMF.H264.View.Expert="Szakértő"
-AMF.H264.View.Master="Mester"
-AMF.H264.Debug="Hibakeresés"
-AMF.H264.Debug.Description="További hibakeresési naplózás engedélyezése, aktiválja amennyiben tanácsadásra van szüksége a kódolóval kapcsolatban."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/it-IT.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/it-IT.ini
Changed
-AMF.Util.Default="Predefinito"
-AMF.Util.Automatic="Automatico"
-AMF.Util.Manual="Manuale"
-AMF.Util.Toggle.Disabled="Disabilitato"
-AMF.Util.Toggle.Enabled="Attivo"
-AMF.H264.Preset="Preset"
-AMF.H264.Preset.ResetToDefaults="Ripristina a Predefiniti"
-AMF.H264.Preset.Recording="Registrazione"
-AMF.H264.Preset.HighQuality="Alta Qualità"
-AMF.H264.Preset.Indistinguishable="Indistinguibili"
-AMF.H264.Preset.Lossless="Lossless"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Utilizzo"
-AMF.H264.Usage.Transcoding="Transcodifica"
-AMF.H264.Usage.UltraLowLatency="Latenza ultra bassa"
-AMF.H264.Usage.LowLatency="Bassa latenza"
-AMF.H264.QualityPreset="Qualità Preset"
-AMF.H264.QualityPreset.Speed="Velocità"
-AMF.H264.QualityPreset.Balanced="Bilanciato"
-AMF.H264.QualityPreset.Quality="Qualità"
-AMF.H264.Profile="Profilo"
-AMF.H264.ProfileLevel="Livello profilo"
-AMF.H264.ProfileLevel.Description="Quale livello di profilo H.264 usare per l'encoding:\n- 'Automatico' calcola il miglior livello per il dato Frame Rate e Frame Size,\n- '4.1' supporta 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' supporta 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' supporta 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' supporta 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' supporta 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Metodo di controllo della frequenza"
-AMF.H264.RateControlMethod.CQP="QP Costante (QPC)"
-AMF.H264.RateControlMethod.CBR="Bitrate costante (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Bitrate Variabile (Peak Constrained) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Bitrate variabile (latenza vincolata) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Bitrate di destinazione"
-AMF.H264.Bitrate.Peak="Picco Bitrate"
-AMF.H264.QP.Minimum="QP Minimo"
-AMF.H264.QP.Minimum.Description="Valore QP più basso da utilizzare in un Frame."
-AMF.H264.QP.Maximum="QP Massimo"
-AMF.H264.QP.Maximum.Description="Valore QP più alto da utilizzare in un Frame."
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.VBVBuffer="Buffer VBV"
-AMF.H264.VBVBuffer.Strictness="Strettezza Buffer VBV"
-AMF.H264.VBVBuffer.Size="Dimensione Buffer VBV"
-AMF.H264.VBVBuffer.Fullness="Larghezza Buffer VBV"
-AMF.H264.FillerData="Dati di riempimento"
-AMF.H264.FrameSkipping="Frame Skipping"
-AMF.H264.EnforceHRDCompatibility="Forza compatibilità HRD"
-AMF.H264.KeyframeInterval="Intervallo Keyframe"
-AMF.H264.IDRPeriod="Periodo IDR"
-AMF.H264.DeblockingFilter="Filtro di deblock"
-AMF.H264.ScanType="Tipo di scansione"
-AMF.H264.ScanType.Progressive="Progressivo"
-AMF.H264.ScanType.Interlaced="Interlacciato"
-AMF.H264.MotionEstimation="Stima Movimento"
-AMF.H264.MotionEstimation.None="Nessuno"
-AMF.H264.MotionEstimation.Half="Metà-Pixel"
-AMF.H264.MotionEstimation.Quarter="Quarto-Pixel"
-AMF.H264.MotionEstimation.Both="Meta- & Quarto-Pixel"
-AMF.H264.CodingType="Tipo di codifica"
AMF.H264.MaximumLTRFrames="Fotogrammi LTR Massimi"
AMF.H264.MaximumAccessUnitSize="Massima dimensione di unità d'accesso"
AMF.H264.HeaderInsertionSpacing="Spaziatura di inserimento di intestazione"
AMF.H264.WaitForTask="Attendere per attività"
AMF.H264.WaitForTask.Description="Sconosciuto, sperimentale"
-AMF.H264.PreAnalysisPass="Analisi pre-pass"
-AMF.H264.PreAnalysisPass.Description="Sconosciuto, sperimentale"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Sconosciuto, sperimentale"
-AMF.H264.GOPSize="Dimensione del GOP"
-AMF.H264.GOPSize.Description="Sconosciuto, sperimentale"
-AMF.H264.GOPAlignment="Allineamento del GOP"
-AMF.H264.GOPAlignment.Description="Sconosciuto, sperimentale"
-AMF.H264.MaximumReferenceFrames="Numero massimo di frames di riferimento"
-AMF.H264.MaximumReferenceFrames.Description="Sconosciuto, sperimentale"
AMF.H264.SlicesPerFrame="Slices Per Frame"
AMF.H264.SliceMode="Modalità Slice"
AMF.H264.SliceMode.Description="Sconosciuto, sperimentale"
AMF.H264.MaximumSliceSize="Dimensione massima frame"
AMF.H264.MaximumSliceSize.Description="Sconosciuto, sperimentale"
AMF.H264.SliceControlMode="Modalità di controllo Slice"
-AMF.H264.View="Modalità di visualizzazione"
-AMF.H264.View.Basic="Basico"
-AMF.H264.View.Advanced="Avanzate"
-AMF.H264.View.Expert="Esperto"
-AMF.H264.View.Master="Master"
-AMF.H264.Debug="Debug"
+AMF.H264.SliceControlMode.Description="Sconosciuto, sperimentale"
+AMF.H264.SliceControlSize.Description="Sconosciuto, sperimentale"
+AMF.H264.IntraRefresh.NumberOfStripes.Description="Sconosciuto, sperimentale"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/ja-JP.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/ja-JP.ini
Changed
-AMF.Util.Default="既定値"
-AMF.Util.Automatic="自動"
-AMF.Util.Manual="手動"
-AMF.Util.Toggle.Disabled="無効"
-AMF.Util.Toggle.Enabled="有効"
-AMF.H264.Preset="プリセット"
-AMF.H264.Preset.ResetToDefaults="既定値に戻す"
-AMF.H264.Preset.Recording="録画中"
-AMF.H264.Preset.HighQuality="高品質"
-AMF.H264.Preset.Indistinguishable="区別不能品質"
-AMF.H264.Preset.Lossless="無損失品質"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="用途"
-AMF.H264.Usage.Description="AMFの使用方法:\n- '\@AMF.H264.Usage.Transcoding\@' は汎用のトランスコード (推奨) で、\n- '\@AMF.H264.Usage.UltraLowLatency\@' は超低遅延エンコード用で、\n- '\@AMF.H264.Usage.LowLatency\@' は上記と似ていますが少し遅延があります。\n配信は '\@AMF.H264.Usage.Transcoding\@' のみをサポートし、他のすべての値は録画で使用できます。"
-AMF.H264.Usage.Transcoding="変換"
-AMF.H264.Usage.UltraLowLatency="超低遅延"
-AMF.H264.Usage.LowLatency="低遅延"
-AMF.H264.QualityPreset="品質プリセット"
-AMF.H264.QualityPreset.Description="どの品質プリセットをAMFが目標とするか:\n- '\@AMF.H264.QualityPreset.Speed\@' は最速ですが品質は最悪で、\n- '\@AMF.H264.QualityPreset.Balanced\@' は両方のバランスの取れた組み合わせで、\n- '\@AMF.H264.QualityPreset.Quality\@' は指定されたビットレートに対して最高の品質を提供します。"
-AMF.H264.QualityPreset.Speed="速度"
-AMF.H264.QualityPreset.Balanced="バランス"
-AMF.H264.QualityPreset.Quality="品質"
-AMF.H264.Profile="プロファイル"
-AMF.H264.Profile.Description="エンコードに使用するH.264プロファイル:\n- 'Baseline' はプラットフォームごとのサポートが最も大きく、\n- 'Main' は古いデバイスでもサポートされ、(モバイルデバイスがターゲットなら推奨)\n- 'High' は現在主流のデバイスでサポートされています。(推奨)"
-AMF.H264.ProfileLevel="プロファイルレベル"
-AMF.H264.ProfileLevel.Description="エンコードに使用する H.264 プロファイルレベル:\n- '自動' は与えられたフレームレートと解像度に対して最適なプロファイルレベルを計算し、\n- '4.1' は 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS をサポートし、\n- '4.2' は 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS をサポートし、\n- '5.0' は 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS をサポートし、\n- '5.1' は 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS をサポートし、\n- '5.2' は 3840x2160 60FPS、1920 x 1080 172FPS, 1280 x 720 172FPS, 960 x 540 172FPS をサポートします。"
-AMF.H264.RateControlMethod="レート制御方式"
-AMF.H264.RateControlMethod.Description="どのレート制御方法を使用すべきか:\n- '\@AMF.H264.RateControlMethod.CQP\@' は固定I-/P-/B-フレームQP (量子化パラメータ) の値を割り当て、\n- '\@AMF.H264.RateControlMethod.CBR\@' は指定された目標ビットレート (フィラーデータを使用) に留まり (配信に推奨)、\n- '\@AMF.H264.RateControlMethod.VBR\@' は指定されたピークビットレート以下にとどまり、\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@' はGPU遅延と負荷が許可されている場合は目標ビットレートに近く、それ以外の場合はより高いビットレートを使用します (録画に推奨)。"
-AMF.H264.RateControlMethod.CQP="固定QP (CQP)"
-AMF.H264.RateControlMethod.CBR="固定ビットレート (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="可変ビットレート (ピーク制約) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="可変ビットレート (遅延制約) (VBR_LAT)"
-AMF.H264.Bitrate.Target="目標ビットレート"
-AMF.H264.Bitrate.Target.Description="全体的なシーケンスで達成しようとするビットレート。"
-AMF.H264.Bitrate.Peak="最大ビットレート"
-AMF.H264.Bitrate.Peak.Description="全体的なシーケンスでピークを最大にしようとするビットレート。"
-AMF.H264.QP.Minimum="最小QP"
-AMF.H264.QP.Minimum.Description="フレームで使用する最低 QP (量子化パラメーター) の値です。"
-AMF.H264.QP.Maximum="最大QP"
-AMF.H264.QP.Maximum.Description="フレームで使用する最高 QP (量子化パラメーター) の値です。"
-AMF.H264.QP.IFrame="I-フレーム QP"
-AMF.H264.QP.IFrame.Description="I-フレームに使用する固定 QP の値。"
-AMF.H264.QP.PFrame="P-フレーム QP"
-AMF.H264.QP.PFrame.Description="P-フレームに使用する固定 QP の値。"
-AMF.H264.QP.BFrame="B-フレーム QP"
-AMF.H264.QP.BFrame.Description="B-フレームに対して使用する固定 QP (量子化パラメーター) の値です。"
-AMF.H264.VBVBuffer="VBV バッファ"
-AMF.H264.VBVBuffer.Description="どの方法を使用してVBVバッファーサイズを決定する必要があるか:\n- '\@AMF.Util.Automatic\@' は厳密性制約を使用してサイズを計算し、\n- '\@AMF.Util.Manual\@' はユーザーがサイズを制御できるようにします。\nVBV (ビデオバッファリングベリファイア) バッファは特定のレート制御方法により指定された制約内で全体のビットレートを保持するために使用されます。"
-AMF.H264.VBVBuffer.Strictness="VBV バッファ厳密性"
-AMF.H264.VBVBuffer.Strictness.Description="VBV バッファーの厳密さを決定し、100%は可能な限り厳密で0%は制限されません。"
-AMF.H264.VBVBuffer.Size="VBV バッファサイズ"
-AMF.H264.VBVBuffer.Size.Description="シーケンスにおけるビットレート制御のために使用されている VBV バッファーのサイズ。"
-AMF.H264.VBVBuffer.Fullness="VBV バッファ充満"
-AMF.H264.VBVBuffer.Fullness.Description="VBVバッファーの初期状態は、エンコーディングの初期シーケンスにのみ影響します。"
-AMF.H264.FillerData="フィラーデータ"
-AMF.H264.FillerData.Description="フィラーデータを有効にするとエンコーダは空の情報でシーケンスの残りのスペースを埋めることによって少なくともターゲットビットレートを維持することができます。"
-AMF.H264.FrameSkipping="フレームスキップ"
-AMF.H264.FrameSkipping.Description="フレームスキッピングはエンコーダが目標ビットレート要件を満たすためにフレームをドロップすることを可能にする。\nエンコーダがフレームを落とすときに代わりにリピートラストフレームNALをストリームに挿入する。\n目標ビットレートが非常に低い場合に役立ちます。"
-AMF.H264.EnforceHRDCompatibility="HRD 互換性を強制"
-AMF.H264.EnforceHRDCompatibility.Description="フレーム内の最大QP値の変化を制限する仮説的参照デコーダの制限を強制します。\n録画や配信には非推奨で参照ソフトウェアデコーダのみを持つ非常に古いデバイスをターゲットにする場合にのみ使用してください。"
-AMF.H264.KeyframeInterval="キーフレーム間隔"
-AMF.H264.KeyframeInterval.Description="ドロップ不可能なフレーム間の秒数。\nGOPのサイズも制御します。"
-AMF.H264.IDRPeriod="IDR 周期"
-AMF.H264.IDRPeriod.Description="フレーム内の瞬時デコードリフレッシュ (IDR) 間の距離を定義します。 GOP-シーケンスのサイズも制御します。"
-AMF.H264.BFrame.Pattern="B-フレーム"
-AMF.H264.BFrame.Pattern.Description="エンコードに使用するBフレームの数。\n第2世代および第3世代のVCEカードでサポートされています。 エンコーディングのパフォーマンスに悪影響を与えます。"
-AMF.H264.BFrame.DeltaQP="B-フレーム デルタ QP"
-AMF.H264.BFrame.DeltaQP.Description="参照不可能なB-フレームに対する最後のI-フレームまたはP-フレームまでのデルタ QP の値。"
-AMF.H264.BFrame.Reference="参照可能 B-フレーム"
-AMF.H264.BFrame.Reference.Description="B-フレームはP-フレームとI-フレームだけでなく、B-フレームも参照として使用できます。"
-AMF.H264.BFrame.ReferenceDeltaQP="参照可能 B-フレーム デルタ QP"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="参照可能なB-フレームに対する最後のI-フレームまたはP-フレームまでのデルタ QP の値。"
-AMF.H264.DeblockingFilter="デブロックフィルタ"
-AMF.H264.DeblockingFilter.Description="デコーダがエンコードされたストリームに対してデブロックフィルタの使用を許可されているかのフラグを設定します。"
-AMF.H264.ScanType="スキャンの種類"
-AMF.H264.ScanType.Description="どのスキャン方法が使用されるか、'プログレッシブ'を常にこのままにしておきます。"
-AMF.H264.ScanType.Progressive="プログレッシブ"
-AMF.H264.ScanType.Interlaced="インターレース"
-AMF.H264.MotionEstimation="動き推定"
-AMF.H264.MotionEstimation.Description="動き推定はピクセルがどこに移動したかを推定することによってエンコーダが必要とするビットレートを削減します。"
-AMF.H264.MotionEstimation.None="未設定"
-AMF.H264.MotionEstimation.Half="1/2ピクセル"
-AMF.H264.MotionEstimation.Quarter="1/4ピクセル"
-AMF.H264.MotionEstimation.Both="ハーフ & クォーターピクセル"
-AMF.H264.CodingType="コーディングの種類"
-AMF.H264.CodingType.Description="使用するコーディングの種類:\n* \@AMF.Util.Default\@ AMFが決定します。(推奨)\n* CALVC (Context-Adaptive Variable-Length Coding) は高速ですが、容量は大きいです。\n* CABAC (Context-Adaptive Binary Arithmetic Coding) は低速ですが、容量は小さくなります。"
+Utility.Default="既定値"
+Utility.Automatic="自動"
+Utility.Manual="手動"
+Utility.Switch.Disabled="無効"
+Utility.Switch.Enabled="有効"
+Preset="プリセット"
+Preset.ResetToDefaults="既定値に戻す"
+Preset.Recording="録画中"
+Preset.HighQuality="高品質"
+Preset.Indistinguishable="区別不能品質"
+Preset.Lossless="無損失品質"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="用途"
+Usage.Description="AMFの使用方法:\n- '\@Usage.Transcoding\@' は汎用のトランスコード (推奨) で、\n- '\@Usage.UltraLowLatency\@' は超低遅延エンコード用で、\n- '\@Usage.LowLatency\@' は上記と似ていますが少し遅延があります。\n配信は '\@Usage.Transcoding\@' のみをサポートし、他のすべての値は録画で使用できます。"
+Usage.Transcoding="変換"
+Usage.UltraLowLatency="超低遅延"
+Usage.LowLatency="低遅延"
+Usage.Webcam="ウェブカメラ"
+QualityPreset="品質プリセット"
+QualityPreset.Description="どの品質プリセットをAMFが目標とするか:\n- '\@QualityPreset.Speed\@' は最速ですが品質は最悪で、\n- '\@QualityPreset.Balanced\@' は両方のバランスの取れた組み合わせで、\n- '\@QualityPreset.Quality\@' は指定されたビットレートに対して最高の品質を提供します。"
+QualityPreset.Speed="速度"
+QualityPreset.Balanced="バランス"
+QualityPreset.Quality="品質"
+Profile="プロファイル"
+Profile.Description="エンコードに使用するプロファイル。最もサポートが広いもの (上) から最も品質が高いもの (下) まで。"
+ProfileLevel="プロファイルレベル"
+ProfileLevel.Description="エンコードに使用するプロファイルレベル。\@Utility.Automatic\@ のままにするのが最善です。"
+Tier="階層"
+Tier.Description="どの階層でエンコードするか。'High' は高いビットレート/帯域幅の利用を対象とし、'Main' は主流メディアを対象とします。"
+AspectRatio="アスペクト比"
+AspectRatio.Description="どのアスペクト比が出力ファイルに書き込まれるべきか。"
+CodingType="コーディングの種類"
+CodingType.Description="使用するコーディングの種類:\n* \@Utility.Automatic\@ AMFが決定します。(推奨)\n* CAVLC (Context-Adaptive Variable-Length Coding) は高速ですが、容量は大きいです。\n* CABAC (Context-Adaptive Binary Arithmetic Coding) は低速ですが、容量は小さくなります。"
+MaximumReferenceFrames="最大参照フレーム数"
+MaximumReferenceFrames.Description="エンコード時にエンコーダが参照できるフレームの最大数は、エンコーディングの質に直接影響します。"
+RateControlMethod="レート制御方式"
+RateControlMethod.Description="どのレート制御方法を使用すべきか:\n- '\@RateControlMethod.CQP\@' は固定I-/P-/B-フレーム QP の値を割り当て、\n- '\@RateControlMethod.CBR\@' は指定された目標ビットレート (フィラーデータを使用) に留まり (配信に推奨)、\n- '\@RateControlMethod.VBR\@' は指定されたピークビットレート以下にとどまり、\n- '\@RateControlMethod.VBRLAT\@' はGPU遅延と負荷が許可されている場合は目標ビットレートに近く、それ以外の場合はより高いビットレートを使用します (録画に推奨)。"
+RateControlMethod.CQP="固定QP (CQP)"
+RateControlMethod.CBR="固定ビットレート (CBR)"
+RateControlMethod.VBR="可変ビットレート (ピーク制約) (VBR)"
+RateControlMethod.VBRLAT="可変ビットレート (遅延制約) (VBRLAT)"
+PrePassMode="プレパスモード"
+PrePassMode.Description="プレパスはシーケンス内のビットレートのより良い配分を可能にする二次的なビットレート配分パスですが、この効果はカードによって異なる場合があります。"
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (4分の1サイズ)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (2分の1サイズ)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (フルサイズ)"
+Bitrate.Target="目標ビットレート"
+Bitrate.Target.Description="全体的なシーケンスで達成しようとするビットレート。"
+Bitrate.Peak="最大ビットレート"
+Bitrate.Peak.Description="全体的なシーケンスでピークを最大にしようとするビットレート。"
+QP.IFrame="I-フレーム QP"
+QP.IFrame.Description="I-フレームに使用する固定 QP の値。"
+QP.PFrame="P-フレーム QP"
+QP.PFrame.Description="P-フレームに使用する固定 QP の値。"
+QP.BFrame="B-フレーム QP"
+QP.BFrame.Description="B-フレームに使用する固定 QP の値。"
+QP.Minimum="最小QP"
+QP.Minimum.Description="フレームで使用する最低の QP 値。"
+QP.IFrame.Minimum="最小 I-フレーム QP"
+QP.IFrame.Minimum.Description="I-フレームで使用する QP の最低値。"
+QP.PFrame.Minimum="最小 P-フレーム QP"
+QP.PFrame.Minimum.Description="P-フレームで使用する QP の最低値。"
+QP.Maximum="最大QP"
+QP.Maximum.Description="フレームで使用する最高の QP 値。"
+QP.IFrame.Maximum="最大 I-フレーム QP"
+QP.IFrame.Maximum.Description="I-フレームで使用する QP の最高値。"
+QP.PFrame.Maximum="最大 P-フレーム QP"
+QP.PFrame.Maximum.Description="P-フレームで使用する QP の最高値。"
+FillerData="フィラーデータ"
+FillerData.Description="フィラーデータを有効にするとエンコーダは空の情報で残りのスペースを埋めることによって少なくとも \@Bitrate.Target\@ を維持することができます。"
+FrameSkipping="フレームスキップ"
+FrameSkipping.Description="フレームスキッピングはエンコーダが \@Bitrate.Target\@ 要件を満たすためにフレームをドロップすることを可能にします。\nエンコーダがフレームを落とすときに代わりにリピートラストフレームNALをストリームに挿入します。\n \@Bitrate.Target\@ 要件が非常に低い場合に役立ちます。"
+VBAQ="VBAQ"
+VBAQ.Description="ピクセルの分散に基づいてビットレートをより良く分配する '分散に基づく適応型量子化' (VBAQ) の使用を有効にします。\n人間の視覚系が高度なテクスチャ領域のアーチファクトに対して敏感でないという考え方に基づきビットレート配分をより滑らかな表面に向けることができます。\nこれを有効にすると特定のコンテンツの主観的品質が向上する可能性があります。"
+EnforceHRD="HRD を強制"
+EnforceHRD.Description="出力ビットストリームが正確であることを確認するために使用される仮説的参照デコーダの使用を強制します。"
+VBVBuffer="VBV バッファ"
+VBVBuffer.Description="VBV バッファサイズの決定方法:\n- '\@Utility.Automatic\@' は厳密性制約を使用してサイズを計算し、\n- '\@Utility.Manual\@' は利用者がサイズを制御できるようにします。\nVBV バッファ (ビデオバッファリングベリファイア) は特定のレート制御方法により指定された制約内で全体のビットレートを保持するために使用されます。"
+VBVBuffer.Strictness="VBV バッファ厳密性"
+VBVBuffer.Strictness.Description="VBV バッファーの厳密さを決定し、100%は可能な限り厳密で0%は制限されません。"
+VBVBuffer.Size="VBV バッファサイズ"
+VBVBuffer.Size.Description="シーケンスにおけるビットレート制御のために使用されている VBV バッファーのサイズ。"
+VBVBuffer.InitialFullness="VBV バッファ初期充満"
+VBVBuffer.InitialFullness.Description="VBV バッファの初期充足率 (%) がどの程度か、エンコードの初期シーケンスにのみ影響します。"
+KeyframeInterval="キーフレーム間隔"
+KeyframeInterval.Description="キーフレーム同士の間隔(秒単位)。"
+H264.IDRPeriod="IDR 周期 (フレーム数)"
+H264.IDRPeriod.Description="フレーム内の瞬時デコードリフレッシュ (IDR) 間の距離を定義します。 GOP-シーケンスのサイズも制御します。"
+H265.IDRPeriod="IDR 周期 (GOP数)"
+H265.IDRPeriod.Description="GOP内の瞬時デコードリフレッシュ (IDR) 間の距離を定義します。"
+GOP.Type="GOP 型"
+GOP.Type.Description="使用する GOP 型:\n- '\@GOP.Type.Fixed\@' 常にそれぞれの GOP 間の固定距離を使用。\n- '\@GOP.Type.Variable\@' 必要に応じて、さまざまなサイズの GOP が使用されます。\n'\@GOP.Type.Fixed\@' はH264の実装方法次第でローカルネットワーク配信に最適で、一方 '\@GOP.Type.Variable\@' はサイズの小さい高品質の録画に最適です。"
+GOP.Type.Fixed="固定"
+GOP.Type.Variable="可変"
+GOP.Size="GOP サイズ"
+GOP.Size.Description="フレームの GOP (画像グループ) のサイズ。"
+GOP.Size.Minimum="GOP サイズの最小値"
+GOP.Size.Minimum.Description="フレームの GOP (画像グループ) の最小サイズ。"
+GOP.Size.Maximum="GOP サイズの最大値"
+GOP.Size.Maximum.Description="フレームの GOP (画像グループ) の最大サイズ。"
+GOP.Alignment="GOP 配置"
+GOP.Alignment.Description="実験的で、効果は不明です。自己責任で使用してください。"
+BFrame.Pattern="B-フレームパターン"
+BFrame.Pattern.Description="エンコード時に使用するBフレームの数。\n第2世代および第3世代のVCEカードでサポートされています。 エンコード時のパフォーマンスに悪影響を与えます。"
+BFrame.DeltaQP="B-フレームデルタ QP"
+BFrame.DeltaQP.Description="参照不可能なB-フレームに対する最後のI-フレームまたはP-フレームまでのデルタ QP の値。"
+BFrame.Reference="B-フレーム参照"
+BFrame.Reference.Description="B-フレームはP-フレームとI-フレームだけでなく、B-フレームも参照として使用できます。"
+BFrame.ReferenceDeltaQP="B-フレーム参照デルタ QP"
+BFrame.ReferenceDeltaQP.Description="参照可能なB-フレームに対する最後のI-フレームまたはP-フレームまでのデルタ QP の値。"
+DeblockingFilter="デブロックフィルタ"
+DeblockingFilter.Description="デコーダにデブロックフィルタの適用を許可します。"
+MotionEstimation="動き推定"
+MotionEstimation.Description="動き推定はピクセルがどこに移動したかを推定することによってエンコーダが必要とするビットレートを削減します。"
+MotionEstimation.Quarter="1/4ピクセル"
+MotionEstimation.Half="1/2ピクセル"
+MotionEstimation.Full="ハーフ & クォーターピクセル"
+Video.API="映像 API"
+Video.API.Description="バックエンドはどの API を使用すべきか?"
+Video.Adapter="ビデオアダプター"
+Video.Adapter.Description="どのアダプターでエンコードを試みるか?"
+OpenCL="OpenCL"
+OpenCL.Description="OpenCL をフレームの送信に使用する必要がありますか? 技術的に高速ですが、インテルのドライバに問題が発生します。 (互換性のない OpenCL ライブラリのため)"
+View="表示モード"
+View.Description="どのプロパティを表示する必要がありますか?\n '\@View.Master\@' の使用はサポートを受けることができなくなります。"
+View.Basic="基本"
+View.Advanced="詳細設定"
+View.Expert="エキスパート"
+View.Master="マスター"
+Debug="デバッグ"
+Debug.Description="追加のデバッグメッセージを有効にします。コマンドラインオプション '--verbose --log_unfiltered' (引用符は除く) をつけて OBS Studio を実行することが必要です。"
AMF.H264.MaximumLTRFrames="最大 LTR フレーム"
AMF.H264.MaximumLTRFrames.Description="長期間参照 (LTR) フレームはエンコーダがシーケンス内の特定のフレームに長期間参照可能なフラグを立てる機能です。\nLTRフレームはB-ピクチャとの併用は不可でエンコーダはB-ピクチャが使用されている場合は無効にします。"
AMF.H264.MaximumAccessUnitSize="最大アクセスユニットサイズ"
AMF.H264.HeaderInsertionSpacing.Description="NALヘッダーの間にあるフレーム数。 これを0 (自動) から変更することはお勧めしません。"
AMF.H264.WaitForTask="タスクの待機"
AMF.H264.WaitForTask.Description="不明、実験的"
-AMF.H264.PreAnalysisPass="事前解析パス"
-AMF.H264.PreAnalysisPass.Description="不明、実験的"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="不明、実験的"
-AMF.H264.GOPSize="GOP サイズ"
-AMF.H264.GOPSize.Description="不明、実験的"
-AMF.H264.GOPAlignment="GOP 配置"
-AMF.H264.GOPAlignment.Description="不明、実験的"
-AMF.H264.MaximumReferenceFrames="最大参照フレーム"
-AMF.H264.MaximumReferenceFrames.Description="不明、実験的"
AMF.H264.SlicesPerFrame="フレームあたりのスライス"
AMF.H264.SlicesPerFrame.Description="各スロットにいくつのI-フレームスライスを格納する必要があるか?\nゼロの値を指定するとエンコーダは高速で決定します。\nイントラ-リフレッシュのエンコードは高速再生とシークに使用されます。"
AMF.H264.SliceMode="スライスモード"
AMF.H264.IntraRefresh.NumberOfStripes.Description="不明、実験的"
AMF.H264.IntraRefresh.MacroblocksPerSlot="スロットごとのマクロブロックのイントラ-リフレッシュ数"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="各スロットにいくつのマクロブロックを格納する必要があるか?\nゼロの値を指定するとこの機能は無効になります。\nイントラ-リフレッシュのエンコードは高速再生とシークに使用されます。"
-AMF.H264.VideoAPI="映像 API"
-AMF.H264.VideoAPI.Description="エンコードに使用する API。"
-AMF.H264.VideoAdapter="ビデオアダプター"
-AMF.H264.VideoAdapter.Description="エンコードに使用するアダプター。"
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="エンコーダはOpenCLを使用して個々のフレームを送信する必要がありますか?"
-AMF.H264.View="表示モード"
-AMF.H264.View.Description="どのプロパティが表示されるか。 'エキスパート' または 'マスター' の表示モードを使用する場合はサポートを受けられません。"
-AMF.H264.View.Basic="基本"
-AMF.H264.View.Advanced="詳細設定"
-AMF.H264.View.Expert="エキスパート"
-AMF.H264.View.Master="マスター"
-AMF.H264.Debug="デバッグ"
-AMF.H264.Debug.Description="追加のデバッグログ出力を有効にし、これはこのエンコーダでサポートが必要なときにアクティブにする必要があります。"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/ko-KR.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/ko-KR.ini
Changed
-AMF.Util.Default="기본값"
-AMF.Util.Automatic="자동"
-AMF.Util.Manual="수동"
-AMF.Util.Toggle.Disabled="비활성화"
-AMF.Util.Toggle.Enabled="활성화"
-AMF.H264.Preset="사전 설정"
-AMF.H264.Preset.ResetToDefaults="기본값으로 복구"
-AMF.H264.Preset.Recording="녹화"
-AMF.H264.Preset.HighQuality="우수한 품질"
-AMF.H264.Preset.Indistinguishable="무손실과 거의 구분 불가"
-AMF.H264.Preset.Lossless="무손실"
-AMF.H264.Preset.Twitch="트위치"
-AMF.H264.Preset.YouTube="유튜브"
-AMF.H264.Usage="용도"
-AMF.H264.Usage.Description="용도에 따라 AMF을 조율합니다:\n- '트랜스코딩'은 일반적인 용도에 적합한 설정입니다(추천).\n- '매우 낮은 지연시간'은 아주 낮은 인코딩 레이턴시를 의미합니다.\n- '낮은 지연시간'은 그 수준이 조금 더 높습니다.\n방송 목적으로는 '트랜스코딩'만 지원됩니다. 모든 다른 설정값은 녹화 용도에만 사용됩니다."
-AMF.H264.Usage.Transcoding="트랜스코딩"
-AMF.H264.Usage.UltraLowLatency="매우 낮은 지연 시간"
-AMF.H264.Usage.LowLatency="낮은 지연 시간"
-AMF.H264.QualityPreset="품질 사전 설정"
-AMF.H264.QualityPreset.Description="AMF 품질을 결정할 때 어떤 수준을 목표로 할지 결정합니다:\n-'속도'는 가장 빠르지만, 품질이 가장 나쁘며,\n- '균형'은 '속도'와 '품질' 사이의 수준을 제공합니다.\n- '품질'은 주어진 비트레이트 내에서 가장 뛰어난 품질을 제공합니다."
-AMF.H264.QualityPreset.Speed="속도"
-AMF.H264.QualityPreset.Balanced="균형"
-AMF.H264.QualityPreset.Quality="품질"
-AMF.H264.Profile="프로파일"
-AMF.H264.Profile.Description="인코딩에 사용할 H.264 프로파일을 선택합니다:\n- 'Baseline'은 지원하는 플랫폼이 가장 많습니다,\n- 'Main'은 비교적 오래된 장치에서 지원합니다. (모바일 장치를 대상으로 한다면 추천하는 설정입니다),\n- 'High'는 현재 장치에서 지원하는 설정입니다 (추천)."
-AMF.H264.ProfileLevel="프로필 수준"
-AMF.H264.ProfileLevel.Description="인코딩으로 사용할 H.264 프로파일 수준을 지정합니다:\n-'자동'은 설정된 프레임과 그 크기에 맞춰 가장 좋은 수준을 계산합니다,\n-'4.1'은 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS을 지원합니다,\n-'4.2'는 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS을 지원합니다,\n- '5.0'은 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS을 지원합니다,\n- '5.1'은 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS을 지원합니다,\n- '5.2'은 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS을 지원합니다."
-AMF.H264.RateControlMethod="속도 제어 방식"
-AMF.H264.RateControlMethod.Description="속도 제어 동작 방식을 설정합니다:\n- '\@AMF.H264.RateControlMethod.CQP\@'는 고정된 I-/P-/B- 프레임 QP (양자화 매개변수) 값을 할당합니다, \n- '\@AMF.H264.RateControlMethod.CBR\@'은 주어진 목표 비트레이트를 유지합니다. (채우기 정보를 사용)(방송에 추천함),\n- '\@AMF.H264.RateControlMethod.VBR\@'은 주어진 최대 비트레이트보다 낮은 수준을 유지합니다,\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@'는 GPU의 지연시간과 부하가 허용한다면 목표 비트레이트에 가까운 수준을 유지하며, 그렇지 않은 상황에서는 그보다 높은 비트레이트를 사용합니다. (녹화에 추천)."
-AMF.H264.RateControlMethod.CQP="고정 QP (CQP)"
-AMF.H264.RateControlMethod.CBR="고정 비트레이트 (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="가변 비트레이트 (최대 비트레이트 제약) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="가변 비트레이트(지연율 제약) (VBR_LAT)"
-AMF.H264.Bitrate.Target="목표 비트레이트"
-AMF.H264.Bitrate.Target.Description="전체 장면에서 달성하려는 비트레이트"
-AMF.H264.Bitrate.Peak="최대 비트레이트"
-AMF.H264.Bitrate.Peak.Description="전체 장면에서 달성하려는 비트레이트 최고조"
-AMF.H264.QP.Minimum="최소 QP"
-AMF.H264.QP.Minimum.Description="하나의 프레임에 사용하는 최저 QP (양자화 매개변수) 값"
-AMF.H264.QP.Maximum="최대 QP"
-AMF.H264.QP.Maximum.Description="하나의 프레임에 사용하는 최고 QP (양자화 매개변수) 값"
-AMF.H264.QP.IFrame="I-프레임 QP"
-AMF.H264.QP.IFrame.Description="I-화면에 사용할 고정 QP 값입니다."
-AMF.H264.QP.PFrame="P-프레임 QP"
-AMF.H264.QP.PFrame.Description="P-화면에 사용할 고정 QP 값입니다."
-AMF.H264.QP.BFrame="BP-프레임 QP"
-AMF.H264.QP.BFrame.Description="B-화면에 사용하는 고정 QP (양자화 매개변수) 값."
-AMF.H264.VBVBuffer="VBV 버퍼"
-AMF.H264.VBVBuffer.Description="VBV 버퍼 크기를 결정하는 방식을 설정합니다:\n-'자동'은 엄격도의 제약하에 크기를 계산합니다.\n-'수동'은 사용자가 크기를 조절할 수 있도록 허용합니다.\n일부 특정 속도제어 방식이 사용하는 VBV (Video Buffering Verifier) 버퍼는 제공된 제약 안에서 전반적인 비트레이트를 유지합니다."
-AMF.H264.VBVBuffer.Strictness="VBV 버퍼 엄격도"
-AMF.H264.VBVBuffer.Strictness.Description="VBV 버퍼의 엄격도를 설정합니다. 0%는 제약이 사라지고 100%는 최대한 제한을 합니다."
-AMF.H264.VBVBuffer.Size="VBV 버퍼 크기"
-AMF.H264.VBVBuffer.Size.Description="VBV의 크기는 한 장면의 비트레이트 제어에 사용하는 단위입니다."
-AMF.H264.VBVBuffer.Fullness="VBV 버퍼 충만도"
-AMF.H264.VBVBuffer.Fullness.Description="초기 VBV버퍼가 얼마나 충만한지는 오로지 인코딩의 초반에만 영향을 미칩니다."
-AMF.H264.FillerData="채우기 정보"
-AMF.H264.FillerData.Description="데이터 채우기는 인코더가 한 장면에 남은 공간을 빈 정보로 채워 목표 비트레이트를 유지할 수 있도록 합니다."
-AMF.H264.FrameSkipping="프레임 생략"
-AMF.H264.FrameSkipping.Description="프레임 생략은 인코더가 목표로 하는 비트레이트 요건을 맞추기 위해 프레임을 떨어뜨릴 수 있습니다.\n인코더가 프레임 하나를 떨어뜨리면 마지막 프레임 NAL을 대신 전송합니다.\n목표 비트레이트가 아주 낮을 때 도움이 될 수 있습니다."
-AMF.H264.EnforceHRDCompatibility="HDR 호환모드 적용"
-AMF.H264.EnforceHRDCompatibility.Description="이론적 기본 디코더 강제는 프레임 하나의 최대 QP값 변화를 제한하는 설정입니다.\n녹화나 방송에는 적합하지 않고 기본 소프트웨어 디코더만 사용 가능한 매우 낡은 장치에서 영상을 재생할 때 사용합니다."
-AMF.H264.KeyframeInterval="키프레임 간격"
-AMF.H264.KeyframeInterval.Description="손실이 불가능한 프레임 사이에 얼마나 많은 시간(초)이 필요한지 설정합니다.\n또한 영상(GOP) 크기도 제어합니다."
-AMF.H264.IDRPeriod="IDR 주기"
-AMF.H264.IDRPeriod.Description="프레임 내에서 순간 복호 갱신(nstantaneous Decoding Refreshes) 사이의 거리를 설정합니다. 또한, GOP-장면 크기를 제어합니다."
-AMF.H264.BFrame.Pattern="B-화면"
-AMF.H264.BFrame.Pattern.Description="인코딩에 얼마나 많은 B-화면을 사용할지 설정합니다.\n2, 3세대 VCE카드에서 지원합니다. 인코딩 성능에 부정적인 영향을 줍니다."
-AMF.H264.BFrame.DeltaQP="B-화면 델타 QP"
-AMF.H264.BFrame.DeltaQP.Description="비참조 B-화면에 쓰이는 마지막 I- 혹은P-화면의 델타 QP 값"
-AMF.H264.BFrame.Reference="참조가능한 B-화면"
-AMF.H264.BFrame.Reference.Description="P-와 I-화면뿐만 아니라 B-화면도 참조할 수 있도록 허용합니다."
-AMF.H264.BFrame.ReferenceDeltaQP="참조가능한 B-화면 델타 QP"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="비참조 B-화면에 쓰이는 마지막 I- 혹은P-화면의 델타 QP 값."
-AMF.H264.DeblockingFilter="디블록 필터"
-AMF.H264.DeblockingFilter.Description="디코더가 인코딩된 작업에 디블록 필터를 사용할 수 있도록 허용합니다."
-AMF.H264.ScanType="스캔 형식"
-AMF.H264.ScanType.Description="주사 방식을 설정합니다. 항상 '프로그레시브'로 두십시오."
-AMF.H264.ScanType.Progressive="프로그레시브"
-AMF.H264.ScanType.Interlaced="인터레이스"
-AMF.H264.MotionEstimation="동작 예측"
-AMF.H264.MotionEstimation.Description="동작 추정은 픽셀의 움직임을 추정하여 필요한 비트레이트를 줄일 수 있게 합니다."
-AMF.H264.MotionEstimation.None="없음"
-AMF.H264.MotionEstimation.Half="1/2 화소"
-AMF.H264.MotionEstimation.Quarter="1/4 화소"
-AMF.H264.MotionEstimation.Both="1/2 & 1/4 화소"
-AMF.H264.CodingType="부호화 형식"
-AMF.H264.CodingType.Description="부호화 형식을 결정합니다:\n* \@AMF.Util.Default\@ 은 AMF가 자동으로 선택합니다 (추천).\n* CALVC (문맥기반 적응적 가변길이 부호화) 는 빠르지만, 더 큽니다.\n* CABAC (문맥기반 적응적 이진산술 부호화) 는 느리지만, 더 작습니다."
+Utility.Default="기본값"
+Utility.Automatic="자동"
+Utility.Manual="수동"
+Utility.Switch.Disabled="비활성화"
+Utility.Switch.Enabled="활성화"
+Preset="사전 설정"
+Preset.ResetToDefaults="기본값으로 복구"
+Preset.Recording="녹화 중"
+Preset.HighQuality="우수한 품질"
+Preset.Indistinguishable="무손실과 거의 구분 불가"
+Preset.Lossless="무손실"
+Preset.Twitch="트위치"
+Preset.YouTube="유튜브"
+Usage="용도"
+Usage.Description="용도에 따라 AMF을 조율합니다:\n- '트랜스코딩'은 일반적인 용도에 적합한 설정입니다(추천).\n- '매우 낮은 지연시간'은 아주 낮은 인코딩 레이턴시를 의미합니다.\n- '낮은 지연시간'은 그 수준이 조금 더 높습니다.\n방송 목적으로는 '트랜스코딩'만 지원됩니다. 모든 다른 설정값은 녹화 용도에만 사용됩니다."
+Usage.Transcoding="트랜스코딩"
+Usage.UltraLowLatency="매우 낮은 지연 시간"
+Usage.LowLatency="낮은 지연 시간"
+Usage.Webcam="웹캠"
+QualityPreset="품질 사전 설정"
+QualityPreset.Description="AMF에서 목표로 하는 품질을 설정합니다:\n- '\@QualityPreset.Speed\@' 는 빠르지만 품질이 떨어지는 반면,\n- '\@QualityPreset.Balanced\@' 는 품질과 속도에 균형을 이룹니다.\n- '\@QualityPreset.Quality\@' 는 설정된 비트레이트 안에서 최고의 품질을 제공합니다."
+QualityPreset.Speed="속도"
+QualityPreset.Balanced="균형"
+QualityPreset.Quality="품질"
+Profile="프로파일"
+Profile.Description="인코딩에서 사용할 수 있는 프로파일을 나열했습니다."
+ProfileLevel="프로필 수준"
+ProfileLevel.Description="인코딩에서 사용할 프로필 수준을 결정합니다. \@Utility.Automatic\@로 둘 것을 권장합니다."
+Tier="단계"
+Tier.Description="인코딩에서 사용하는 단계를 설정합니다. Main이 기본 수준입니다. High는 높은 비트레이트를 필요로 하는 용도에 적합합니다."
+AspectRatio="가로세로 비율"
+AspectRatio.Description="출력 파일의 가로 세로 비율을 설정합니다."
+CodingType="부호화 형식"
+CodingType.Description="부호화 형식을 결정합니다:\n* \@Utility.Default\@ 은 AMF가 자동으로 선택합니다 (추천).\n* CAVLC (문맥기반 적응적 가변길이 부호화) 는 빠르지만, 더 큽니다.\n* CABAC (문맥기반 적응적 이진산술 부호화) 는 느리지만, 더 작습니다."
+MaximumReferenceFrames="최대 참조 프레임"
+MaximumReferenceFrames.Description="인코더가 얼마나 많은 프레임을 참조할지 설정합니다. 인코딩 품질에 큰 영향을 미칩니다."
+RateControlMethod="속도 제어 방식"
+RateControlMethod.Description="속도 제어 동작 방식을 설정합니다:\n- '\@RateControlMethod.CQP\@'는 고정된 I-/P-/B- 프레임 QP (양자화 매개변수) 값을 할당합니다, \n- '\@RateControlMethod.CBR\@'은 주어진 목표 비트레이트를 유지합니다. (채우기 정보를 사용)(방송에 추천함),\n- '\@RateControlMethod.VBR\@'은 주어진 최대 비트레이트보다 낮은 수준을 유지합니다,\n- '\@RateControlMethod.VBRLAT\@'는 GPU의 지연시간과 부하가 허용한다면 목표 비트레이트에 가까운 수준을 유지하며, 그렇지 않은 상황에서는 그보다 높은 비트레이트를 사용합니다. (녹화에 추천)."
+RateControlMethod.CQP="고정 QP (CQP)"
+RateControlMethod.CBR="고정 비트레이트 (CBR)"
+RateControlMethod.VBR="가변 비트레이트 (최대 비트레이트 제약) (VBR)"
+RateControlMethod.VBRLAT="가변 비트레이트(지연율 제약) (VBR_LAT)"
+PrePassMode="프리-패스 모드"
+PrePassMode.Description="프리-패스 방식은 부가적인 처리를 하여 한 장면에서 비트레이트를 더 효율적으로 배분합니다. 카드마다 결과가 다를 수 있습니다."
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (1/4 크기)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (1/2 크기)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (전체 크기)"
+Bitrate.Target="목표 비트레이트"
+Bitrate.Target.Description="전체 장면에서 달성하려는 비트레이트"
+Bitrate.Peak="최대 비트레이트"
+Bitrate.Peak.Description="전체 장면에서 달성하려는 비트레이트 최고조"
+QP.IFrame="I-프레임 QP"
+QP.IFrame.Description="I-화면에 사용할 고정 QP 값입니다."
+QP.PFrame="P-프레임 QP"
+QP.PFrame.Description="P-화면에 사용할 고정 QP 값입니다."
+QP.BFrame="B-프레임 QP"
+QP.BFrame.Description="B-화면에 사용할 고정 QP 값입니다."
+QP.Minimum="최소 QP"
+QP.Minimum.Description="프레임 한 장에 사용할 최저 QP 값입니다."
+QP.IFrame.Minimum="최소 I-프레임 QP"
+QP.IFrame.Minimum.Description="I-프레임 한 장에 사용할 최저 QP 값입니다."
+QP.PFrame.Minimum="최소 P-프레임 QP"
+QP.PFrame.Minimum.Description="P-프레임 한 장에 사용할 최저 QP 값입니다."
+QP.Maximum="최대 QP"
+QP.Maximum.Description="프레임 한 장에 사용할 최고 QP 값입니다."
+QP.IFrame.Maximum="최대 I-프레임 QP"
+QP.IFrame.Maximum.Description="I-프레임 한 장에 사용할 최고 QP 값입니다."
+QP.PFrame.Maximum="최대 P-프레임 QP"
+QP.PFrame.Maximum.Description="P-프레임 한 장에 사용할 최고 QP 값입니다."
+FillerData="채우기 정보"
+FillerData.Description="데이터 채우기는 인코더가 한 장면에 남은 공간을 빈 정보로 채워 목표 비트레이트를 유지할 수 있도록 합니다."
+FrameSkipping="프레임 생략"
+FrameSkipping.Description="프레임 생략은 인코더가 목표로 하는 비트레이트 요건을 맞추기 위해 프레임을 떨어뜨릴 수 있습니다.\n인코더가 프레임 하나를 떨어뜨리면 마지막 프레임 NAL을 대신 전송합니다.\n목표 비트레이트가 아주 낮을 때 도움이 될 수 있습니다."
+VBAQ="VBAQ"
+VBAQ.Description="'분산 기반 적응 양자화' (VBAQ) 를 사용하여 비트레이트를 효율적으로 배분합니다.\n이 방식은 인간의 시각 체계가 고밀도의 질감을 표현하는 영역에서 보이는 인공결함에 대해 덜 예민한 점을 이용했습니다. 따라서 동일한 비트레이트로 더 매끄러운 표면을 연출할 수 있습니다.\n처리하는 영상의 내용에 따라 품질에 대한 주관적인 평가가 개선될 수 있습니다."
+EnforceHRD="HRD 강제 적용"
+EnforceHRD.Description="가상 참조 디코더를 강제로 적용하여 출력 비트스트림을 검증합니다."
+VBVBuffer="VBV 버퍼"
+VBVBuffer.Description="VBV 버퍼 크기를 결정하는 방식을 설정합니다:\n-'자동'은 엄격도의 제약하에 크기를 계산합니다.\n-'수동'은 사용자가 크기를 조절할 수 있도록 허용합니다.\n일부 특정 속도제어 방식이 사용하는 VBV (Video Buffering Verifier) 버퍼는 제공된 제약 안에서 전반적인 비트레이트를 유지합니다."
+VBVBuffer.Strictness="VBV 버퍼 엄격도"
+VBVBuffer.Strictness.Description="VBV 버퍼의 엄격도를 설정합니다. 0%는 제약이 사라지고 100%는 최대한 제한을 합니다."
+VBVBuffer.Size="VBV 버퍼 크기"
+VBVBuffer.Size.Description="VBV의 크기는 한 장면의 비트레이트 제어에 사용하는 단위입니다."
+VBVBuffer.InitialFullness="VBV 버퍼 초기 충만도"
+VBVBuffer.InitialFullness.Description="초기 VBV버퍼가 얼마나 충만한지는 오로지 인코딩의 초반에만 영향을 미칩니다."
+KeyframeInterval="키프레임 간격"
+KeyframeInterval.Description="키프레임 사이의 간격 (초)."
+H264.IDRPeriod="IDR 기간 (프레임)"
+H264.IDRPeriod.Description="프레임 내에서 순간 복호 갱신(nstantaneous Decoding Refreshes) 사이의 거리를 설정합니다. 또한, GOP-장면 크기를 제어합니다."
+H265.IDRPeriod="IDR 기간 (프레임)"
+H265.IDRPeriod.Description="GOP 내에서 Instantaneous Decoding Refreshes (IDR) 의 간격을 설정합니다."
+GOP.Type="GOP 형식"
+GOP.Type.Description="GOP 형식을 결정합니다:\n- '\@GOP.Type.Fixed\@'는 GOP 간격을 항상 고정합니다.\n- '\@GOP.Type.Variable\@' 는 GOP 의 크기를 필요에 따라 조정합니다.\n'\@GOP.Type.Fixed\@'는 H264적용 방식이며 로컬 네트워크 스트리밍에 가장 알맞습니다. '\@GOP.Type.Variable\@'는 작은 크기, 고품질 녹화에 가장 알맞습니다."
+GOP.Type.Fixed="고정"
+GOP.Type.Variable="가변"
+GOP.Size="GOP 크기"
+GOP.Size.Description="프레임 내 GOP 크기"
+GOP.Size.Minimum="GOP 크기 최소"
+GOP.Size.Minimum.Description="프레임 내 GOP 크기를 최소화함"
+GOP.Size.Maximum="GOP 크기 최대"
+GOP.Size.Maximum.Description="프레임 내 GOP(Group of Pictures) 크기를 최대화함"
+GOP.Alignment="GOP 조정"
+GOP.Alignment.Description="현재 실험 및 개발 중인 기능입니다."
+BFrame.Pattern="B-화면 양상"
+BFrame.Pattern.Description="인코딩에 얼마나 많은 B-화면을 사용할지 설정합니다.\n2, 3세대 VCE카드에서 지원합니다. 인코딩 성능에 부정적인 영향을 줍니다."
+BFrame.DeltaQP="B-화면 델타 QP"
+BFrame.DeltaQP.Description="비참조 B-화면에 쓰이는 마지막 I- 혹은P-화면의 델타 QP 값"
+BFrame.Reference="B-프레임 참조"
+BFrame.Reference.Description="P-와 I-화면뿐만 아니라 B-화면도 참조할 수 있도록 허용합니다."
+BFrame.ReferenceDeltaQP="B-프레임 참조 델타 QP"
+BFrame.ReferenceDeltaQP.Description="참조 B-화면에 쓰이는 마지막 I- 혹은P-화면의 델타 QP 값."
+DeblockingFilter="디블록 필터"
+DeblockingFilter.Description="디코더가 디블록 필터를 적용합니다."
+MotionEstimation="동작 예측"
+MotionEstimation.Description="동작 추정은 픽셀의 움직임을 추정하여 필요한 비트레이트를 줄일 수 있게 합니다."
+MotionEstimation.Quarter="1/4 화소"
+MotionEstimation.Half="1/2 화소"
+MotionEstimation.Full="1/2- & 1/4 화소"
+Video.API="비디오 API"
+Video.API.Description="백엔드에서 어떤 API를 사용할까요?"
+Video.Adapter="비디오 어댑터:"
+Video.Adapter.Description="인코딩에서 어떤 어댑터를 사용합니까?"
+OpenCL="OpenCL"
+OpenCL.Description="OpenCL을 프레임 전송에 사용합니까? 기술적으로는 더 빠르지만 일부 인텔 드라이버와 문제가 발생할 수 있습니다. (OpenCL 라이브러리와 호환이 되지 않습니다.)"
+View="보기 모드"
+View.Description="얼마나 많은 설정을 표시할까요?\n참고하실 것은 '\@View.Master\@'을 사용하면 개발자로부터 지원을 받을 수 없습니다."
+View.Basic="기본"
+View.Advanced="고급"
+View.Expert="전문가"
+View.Master="달인"
+Debug="디버그"
+Debug.Description="추가적으로 디버그 안내를 제공합니다. OBS Studio를 '--verbose --log_unfiltered' 명령어와 함께 실행해야 합니다."
AMF.H264.MaximumLTRFrames="최대 장기참조 프레임"
AMF.H264.MaximumLTRFrames.Description="장기참조 프레임(LTR)은 인코더가 긴 시계에서 특정 프레임을 참조 가능하게 만들어 줍니다.\n장기참조 프레임은 B-화면과 동시에 사용할 수 없습니다."
AMF.H264.MaximumAccessUnitSize="최대 접근 유닛 크기"
AMF.H264.HeaderInsertionSpacing.Description="NAL 헤더 사이에 얼마나 많은 프레임이 필요한지 설정합니다. 0(자동)에서 바꾸는 것은 추천하지 않습니다."
AMF.H264.WaitForTask="작업을 대기"
AMF.H264.WaitForTask.Description="알수 없음, 실험적인 기능"
-AMF.H264.PreAnalysisPass="사전분석 처리"
-AMF.H264.PreAnalysisPass.Description="알수 없음, 실험적인 기능"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="알수 없음, 실험적인 기능"
-AMF.H264.GOPSize="GOP 크기"
-AMF.H264.GOPSize.Description="알수 없음, 실험적인 기능"
-AMF.H264.GOPAlignment="GOP 조정"
-AMF.H264.GOPAlignment.Description="알수 없음, 실험적인 기능"
-AMF.H264.MaximumReferenceFrames="최대 참조 프레임"
-AMF.H264.MaximumReferenceFrames.Description="알수 없음, 실험적인 기능"
AMF.H264.SlicesPerFrame="조각 당 프레임"
AMF.H264.SlicesPerFrame.Description="프레임마다 얼마나 많은 I-화면 조각을 저장할지 결정합니다.\n0값은 인코더가 상태에 따라 맞춰 조절합니다.\n인트라-리프레시 인코딩은 더 빠른 재생과 탐색을 위해 사용합니다."
AMF.H264.SliceMode="분할 모드"
AMF.H264.IntraRefresh.NumberOfStripes.Description="알수 없음, 실험적인 기능"
AMF.H264.IntraRefresh.MacroblocksPerSlot="슬롯 당 매크로블록의 인트라-리프레시 수"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="슬롯마다 얼마나 많은 매크로 블록을 저장할지 결정합니다.\n\0값은 인코더가 상태에 따라 맞춰 조절합니다.\n인트라-리프레시 인코딩은 더 빠른 재생과 탐색을 위해 사용합니다."
-AMF.H264.VideoAPI="비디오 API"
-AMF.H264.VideoAPI.Description="인코딩에 어떤 API를 사용할지 설정합니다."
-AMF.H264.VideoAdapter="비디오 어댑터:"
-AMF.H264.VideoAdapter.Description="인코딩에 어떤 어댑터를 사용할지 설정합니다."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="해당 인코더가 개별 프레임을 제출할 때 OpenCL을 사용하겠습니까?"
-AMF.H264.View="보기 모드"
-AMF.H264.View.Description="어떤 설정을 표시할지 결정합니다. '숙련' 혹은 '달인' 표시 모드에서는 지원을 받을 수 없습니다."
-AMF.H264.View.Basic="기본"
-AMF.H264.View.Advanced="고급"
-AMF.H264.View.Expert="숙련"
-AMF.H264.View.Master="달인"
-AMF.H264.Debug="디버그"
-AMF.H264.Debug.Description="추가적인 디버그 기록을 활성화하여 이 인코더에 대한 지원이 필요할 때 제출하십시오."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/nb-NO.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/nb-NO.ini
Changed
-AMF.Util.Default="Standard"
-AMF.Util.Automatic="Automatisk"
-AMF.Util.Toggle.Disabled="Deaktivert"
-AMF.Util.Toggle.Enabled="Aktivert"
-AMF.H264.Usage.Transcoding="Transkoding"
-AMF.H264.Usage.UltraLowLatency="Ultra-lav latens"
-AMF.H264.Usage.LowLatency="Lav latens"
-AMF.H264.QualityPreset="Kvalitet forhåndsinnstilling"
-AMF.H264.QualityPreset.Speed="Hastighet"
-AMF.H264.QualityPreset.Balanced="Balansert"
-AMF.H264.QualityPreset.Quality="Kvalitet"
-AMF.H264.Profile="Profil"
-AMF.H264.ProfileLevel="Profilnivå"
-AMF.H264.Bitrate.Target="Mål Bitrate"
-AMF.H264.Bitrate.Peak="Maks bitrate"
-AMF.H264.QP.Minimum="Minste QP"
-AMF.H264.QP.Maximum="Maksimal QP"
-AMF.H264.EnforceHRDCompatibility="Håndheve HRD kompatibilitet"
-AMF.H264.ScanType="Skanne Type"
+Utility.Default="Standard"
+Utility.Automatic="Automatisk"
+Utility.Manual="Manuell"
+Utility.Switch.Disabled="Deaktivert"
+Utility.Switch.Enabled="Aktivert"
+Preset="Preset"
+Preset.ResetToDefaults="Tilbakestill innstillinger"
+Preset.Recording="Opptak"
+Preset.HighQuality="Høy kvalitet"
+Preset.Lossless="Tapsfri"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage.Transcoding="Transkoding"
+Usage.Webcam="Webkamera"
+QualityPreset.Speed="Hastighet"
+QualityPreset.Balanced="Balansert"
+QualityPreset.Quality="Kvalitet"
+Profile="Profil"
+Profile.Description="Hvilke profil å kode med. Sortert fra best støttede (øverst) til beste kvalitet (nederst)."
+ProfileLevel="Profilnivå"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/nl-NL.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/nl-NL.ini
Changed
-AMF.Util.Default="Standaard"
-AMF.Util.Automatic="Automatisch"
-AMF.Util.Manual="Handmatig"
-AMF.Util.Toggle.Disabled="Uitgeschakeld"
-AMF.Util.Toggle.Enabled="Ingeschakeld"
-AMF.H264.Preset="Voorkeursinstelling"
-AMF.H264.Preset.ResetToDefaults="Standaardinstellingen herstellen"
-AMF.H264.Preset.Recording="Opname"
-AMF.H264.Preset.HighQuality="Hoge kwaliteit"
-AMF.H264.Preset.Indistinguishable="Ononderscheidbaar"
-AMF.H264.Preset.Lossless="Lossless"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Gebruik"
-AMF.H264.Usage.Description="Voor welk gebruik moet AMF ingesteld worden:\n- 'Transcoden' is standaard transcoden (aanbevolen),\n- 'Ultra Low Latency' is voor erg lage latency encoding,\n- 'Low Latency' is vergelijkbaar met bovenstaande met een iets hogere latency.\nVoor streamen is alleen 'Transcoden' mogelijk, alle andere waarden kunnen worden gebruikt voor opnemen."
-AMF.H264.Usage.Transcoding="Transcoding"
-AMF.H264.Usage.UltraLowLatency="Ultra Low Latency"
-AMF.H264.Usage.LowLatency="Low Latency"
-AMF.H264.QualityPreset="Kwaliteitsinstelling"
-AMF.H264.QualityPreset.Description="Welke kwaliteitsinstelling AMF moet proberen te halen:\n- '\@AMF.H264.QualityPreset.Speed\@' is de snelste maar heeft de slechtste kwaliteit,\n- '\@AMF.H264.QualityPreset.Balanced\@' is een gebalanceerde mix van beide,\n- '\@AMF.H264.QualityPreset.Quality\@' geeft de beste kwaliteit voor een gekozen bitrate."
-AMF.H264.QualityPreset.Speed="Snelheid"
-AMF.H264.QualityPreset.Balanced="Gebalanceerd"
-AMF.H264.QualityPreset.Quality="Kwaliteit"
-AMF.H264.Profile="Profiel"
-AMF.H264.Profile.Description="Welk h.264 profiel gebruikt moet worden, gesorteerd van beste kwaliteit tot breedste ondersteuning."
-AMF.H264.ProfileLevel="Profielniveau"
-AMF.H264.ProfileLevel.Description="Welk H.264 profielniveau moet worden gebruikt voor encoden:\n- 'Automatisch' berekent het beste profielniveau voor de gebruikte framerate en framegrootte.\n- '4.1' ondersteunt 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' ondersteunt 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' ondersteunt1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' ondersteunt3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' ondersteunt 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Rate control methode"
-AMF.H264.RateControlMethod.Description="Welke rate control methode gebruikt moet worden:\n- '\@AMF.H264.RateControlMethod.CQP\@' kent vaste I-/P-/B-Frame QP waardes toe,\n- '\@AMF.H264.RateControlMethod.CBR\@' blijft op de insgestelde doelbitrate (met opvuldata) (aanbevolen voor streamen),\n- '\@AMF.H264.RateControlMethod.VBR\@' blijft onder de ingestelde piek bitrate,\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@' blijft in de buurt van de doelbitrate als de GPU latency en belasting het toestaan, anders zal er een hogere bitrate worden gebruikt (aanbevolen voor opnames)."
-AMF.H264.RateControlMethod.CQP="Constant QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Constant Bitrate (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Variable Bitrate (Peak Constrained) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Variable Bitrate (Latency Constrained) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Doelbitrate"
-AMF.H264.Bitrate.Target.Description="Bitrate om te proberen aan te houden in de gehele reeks."
-AMF.H264.Bitrate.Peak="Maximale bitrate"
-AMF.H264.Bitrate.Peak.Description="Bitrate om in de gehele reeks maximaal naartoe te pieken."
-AMF.H264.QP.Minimum="Minimale QP"
-AMF.H264.QP.Minimum.Description="Laagste QP waarde om te gebruiken in een frame."
-AMF.H264.QP.Maximum="Maximale QP"
-AMF.H264.QP.Maximum.Description="Hoogste QP waarde om te gebruiken in een frame."
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.IFrame.Description="Vaste QP waarde om te gebruiken voor I-frames."
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.PFrame.Description="Vaste QP waarde om te gebruiken voor P-frames."
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.QP.BFrame.Description="Vaste QP waarde om te gebruiken voor B-frames."
-AMF.H264.VBVBuffer="VBV Buffer"
-AMF.H264.VBVBuffer.Description="Welke methode moet worden gebruikt om de VBV buffergrootte te bepalen:\n- '\@AMF.Util.Automatic\@' berekent de grootte met een strengheid-instelling.\n- '\@AMF.Util.Manual\@' laat de gebruiker de grootte bepalen.\nVBV (Video Buffering Verifier) buffer wordt gebruikt door bepaalde Rate Control methodes om de algehele bitrate binnen de aangegeven perken te houden."
-AMF.H264.VBVBuffer.Strictness="VBV Buffer-strengheid"
-AMF.H264.VBVBuffer.Strictness.Description="Bepaalt de strengheid van de VBV buffer, waar 100% zo streng mogelijk is, en 0% geen restricties oplegt."
-AMF.H264.VBVBuffer.Size="VBV Buffergrootte"
-AMF.H264.VBVBuffer.Size.Description="De grootte van de VBV Buffer welke wordt gebruikt voor bitrate controle in een reeks."
-AMF.H264.VBVBuffer.Fullness="VBV Buffervulling"
-AMF.H264.VBVBuffer.Fullness.Description="Hoe vol de VBV Buffer initieel is, beïnvloedt alleen de initiële reeks van encoden."
-AMF.H264.FillerData="Opvuldata"
-AMF.H264.FillerData.Description="Inschakelen van opvuldata laat de encoder tenminste de doelbitrate aanhouden door de overtollige ruimte in een reeks te vullen met lege informatie."
-AMF.H264.FrameSkipping="Frames overslaan"
-AMF.H264.FrameSkipping.Description="Frame Skipping laat een encoder frames droppen om de doelbitrate te halen.\nAls de encoder een frame dropt vult het een herhaal-laatste-frame NAL in de stream.\nKan helpen bij erg lage doelbitrates."
-AMF.H264.EnforceHRDCompatibility="Forceer HDR compatibiliteit"
-AMF.H264.EnforceHRDCompatibility.Description="Forceer hypothetische referentiedecoder-restricties welke de maximale QP waarde veranderen binnen een frame."
-AMF.H264.KeyframeInterval="Keyframe-Interval"
-AMF.H264.KeyframeInterval.Description="Bepaalt de afstand tussen keyframes in seconden. Bepaalt ook de GOP-sequence size."
-AMF.H264.IDRPeriod="IDR Periode"
-AMF.H264.IDRPeriod.Description="Bepaalt de afstand tussen Instantaneous Decoding Refreshes (IDR) in frames. Bepaalt ook de GOP-sequence size."
-AMF.H264.BFrame.Pattern="B-frames"
-AMF.H264.BFrame.Pattern.Description="Het aantal B-frames om te gebruiken tijdens het encoden.\nOndersteund door 2e en 3e generatie VCE-kaarten. Negatieve invloed op encodingprestaties."
-AMF.H264.BFrame.DeltaQP="Delta QP voor B-frames"
-AMF.H264.BFrame.DeltaQP.Description="Delta QP waarde tot de laatste I- of P-frame voor niet-refereerbare B-frames."
-AMF.H264.BFrame.Reference="Refereerbare B-frames"
-AMF.H264.BFrame.Reference.Description="Laat een B-frame ook B-frames gebruiken als referentie, in plaats van enkel P- en I-frames."
-AMF.H264.BFrame.ReferenceDeltaQP="Delta QP voor refereerbare B-frames"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="Delta QP waarde tot de laatste I- of P-frame voor refereerbare B-frames."
-AMF.H264.DeblockingFilter="Deblocking Filter"
-AMF.H264.DeblockingFilter.Description="Staat de encoder toe om een Deblocking Filter te gebruiken voor de gecodeerde stream."
-AMF.H264.ScanType="Scantype"
-AMF.H264.ScanType.Description="Welke scanmethode gebruikt moet worden, laat dit altijd op '\@AMF.H264.ScanType.Progressive\@'."
-AMF.H264.ScanType.Progressive="Progressive"
-AMF.H264.ScanType.Interlaced="Interlaced"
-AMF.H264.MotionEstimation="Bewegingsschatting"
-AMF.H264.MotionEstimation.Description="Bewigingsschatting laat de encoder de benodigde bitrate verlagen door te schatten waar een pixel heen ging."
-AMF.H264.MotionEstimation.None="Geen"
-AMF.H264.MotionEstimation.Half="Halve pixel"
-AMF.H264.MotionEstimation.Quarter="Kwartpixel"
-AMF.H264.MotionEstimation.Both="Halve en kwartpixel"
-AMF.H264.CodingType="Codeertype"
-AMF.H264.CodingType.Description="Welk codeertype gebruikt moet worden:\n* \@AMF.Util.Default\@ laat AMF bepalen (aanbevolen).\n* CALVC (Context-Adaptive Variable-Length Coding) is sneller, maar groter.\n* CABAC (Context-Adaptive Binary Arithmetic Coding) is langzamer, maar kleiner."
+Utility.Default="Standaard"
+Utility.Automatic="Automatisch"
+Utility.Manual="Handmatig"
+Utility.Switch.Disabled="Uitgeschakeld"
+Utility.Switch.Enabled="Ingeschakeld"
+Preset="Voorkeursinstelling"
+Preset.ResetToDefaults="Standaardinstellingen herstellen"
+Preset.Recording="Opname"
+Preset.HighQuality="Hoge kwaliteit"
+Preset.Indistinguishable="Ononderscheidbaar"
+Preset.Lossless="Lossless"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Gebruik"
+Usage.Description="Voor welk gebruik moet AMF ingesteld worden:\n- '\@Usage.Transcoding\@' is standaard transcoden (aanbevolen),\n- '\@Usage.UltraLowLatency\@' is voor erg lage latency encoding,\n- '\@Usage.LowLatency\@' is vergelijkbaar met bovenstaande met een iets hogere latency.\nVoor streamen is alleen '\@Usage.Transcoding\@' mogelijk, alle andere waarden kunnen worden gebruikt voor opnemen."
+Usage.Transcoding="Transcoding"
+Usage.UltraLowLatency="Ultra Low Latency"
+Usage.LowLatency="Low Latency"
+Usage.Webcam="Webcam"
+QualityPreset="Kwaliteitsinstelling"
+QualityPreset.Description="Welke kwaliteitsinstelling AMF moet proberen te halen:\n- '\@QualityPreset.Speed\@' is de snelste maar heeft de slechtste kwaliteit,\n- '\@QualityPreset.Balanced\@' is een gebalanceerde mix van beide,\n- '\@QualityPreset.Quality\@' geeft de beste kwaliteit voor een gekozen bitrate."
+QualityPreset.Speed="Snelheid"
+QualityPreset.Balanced="Gebalanceerd"
+QualityPreset.Quality="Kwaliteit"
+Profile="Profiel"
+Profile.Description="Met welk profiel geëncodet moet worden. Gesorteerd van best ondersteunt (boven) tot beste kwaliteit (onder)."
+ProfileLevel="Profielniveau"
+Tier="Niveau"
+AspectRatio="Beeldverhouding"
+CodingType="Codeertype"
+MaximumReferenceFrames="Maximale referentieframes"
+RateControlMethod="Rate control methode"
+RateControlMethod.CQP="Constant QP (CQP)"
+RateControlMethod.CBR="Constant Bitrate (CBR)"
+RateControlMethod.VBR="Variable Bitrate (Peak Constrained) (VBR)"
+RateControlMethod.VBRLAT="Variable Bitrate (Latency Constrained) (VBRLAT)"
+Bitrate.Target="Doelbitrate"
+VBVBuffer.InitialFullness="Initiële VBV Buffervulling"
+KeyframeInterval="Keyframe-Interval"
+GOP.Type.Fixed="Vast"
+GOP.Type.Variable="Variabel"
+GOP.Size="GOP-grootte"
+GOP.Alignment="GOP-uitlijning"
+BFrame.Pattern.Description="Het aantal B-frames om te gebruiken tijdens het encoden.\nOndersteund door 2e en 3e generatie VCE-kaarten. Negatieve invloed op encodingprestaties."
+BFrame.DeltaQP.Description="Delta QP waarde tot de laatste I- of P-frame voor niet-refereerbare B-frames."
+BFrame.Reference.Description="Laat een B-frame ook B-frames gebruiken als referentie, in plaats van enkel P- en I-frames."
+BFrame.ReferenceDeltaQP.Description="Delta QP waarde tot de laatste I- of P-frame voor refereerbare B-frames."
+DeblockingFilter="Deblocking Filter"
+MotionEstimation="Bewegingsschatting"
+MotionEstimation.Description="Bewigingsschatting laat de encoder de benodigde bitrate verlagen door te schatten waar een pixel heen ging."
+MotionEstimation.Quarter="Kwartpixel"
+MotionEstimation.Half="Halve pixel"
+MotionEstimation.Full="Kwart- & Halve Pixel"
+Video.API="Video API"
+Video.API.Description="Welke API moet de backend gebruiken?"
+Video.Adapter="Videoadapter"
+Video.Adapter.Description="Op welke adapter moeten we proberen te encoden?"
+OpenCL="OpenCL"
+View="Weergavemodus"
+View.Basic="Simpel"
+View.Advanced="Geavanceerd"
+View.Expert="Expert"
+View.Master="Meester"
+Debug="Debug"
AMF.H264.MaximumLTRFrames="Maximale LTR Frames"
AMF.H264.MaximumLTRFrames.Description="Long Term Reference (LTR) frames zijn een functie waarmee de encoder bepaalde frames in een reeks kan aanmerken als refereerbaar gedurende een lange tijd.\nLTR frames kunnen niet met B-frames gebruikt worden, en de encoder zal B-frames ook uitzetten als deze gebruikt worden."
AMF.H264.MaximumAccessUnitSize="Maximale Access Unit grootte"
AMF.H264.HeaderInsertionSpacing.Description="Hoeveel frames er tussen NAL headers moeten zitten."
AMF.H264.WaitForTask="Wacht op taak"
AMF.H264.WaitForTask.Description="Onbekend, experimenteel"
-AMF.H264.PreAnalysisPass="Pre-analyse pass"
-AMF.H264.PreAnalysisPass.Description="Onbekend, experimenteel"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Onbekend, experimenteel"
-AMF.H264.GOPSize="GOP-grootte"
-AMF.H264.GOPSize.Description="Onbekend, experimenteel"
-AMF.H264.GOPAlignment="GOP-uitlijning"
-AMF.H264.GOPAlignment.Description="Onbekend, experimenteel"
-AMF.H264.MaximumReferenceFrames="Maximale referentieframes"
-AMF.H264.MaximumReferenceFrames.Description="Onbekend, experimenteel"
AMF.H264.SlicesPerFrame="Segmenten per frame"
AMF.H264.SlicesPerFrame.Description="Hoeveel I-frame segmenten moeten er worden opgeslagen bij elke frame?\nEen waarde van nul laat de encoder bepalen.\nIntra-refresh encoding wordt gebruikt voor snellere weergave van en zoeken door video."
AMF.H264.SliceMode="Slice-modus"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Onbekend, experimenteel"
AMF.H264.IntraRefresh.MacroblocksPerSlot="Intra-Refresh Macroblocks per Slot"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Hoeveel Macroblocks moeten er in elk slot worden opgeslagen?\nEen waarde van 0 schakelt deze functie uit.\nIntra-Refresh encoding wordt gebruikt voor snellere weergave van en zoeken door video."
-AMF.H264.VideoAPI="Video API"
-AMF.H264.VideoAPI.Description="Welke API gebruikt moet worden voor encoden."
-AMF.H264.VideoAdapter="Videoadapter"
-AMF.H264.VideoAdapter.Description="Welke apter gebruikt moet worden voor encoding."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Moet de encoder OpenCL gebruiken om de individuele frames te leveren?"
-AMF.H264.View="Weergavemodus"
-AMF.H264.View.Description="Welke eigenschappen moeten getoond worden?\nHet gebruiken van '\@AMF.H264.View.Master\@' diskwalificeert je voor het ontvangen van ondersteuning."
-AMF.H264.View.Basic="Simpel"
-AMF.H264.View.Advanced="Geavanceerd"
-AMF.H264.View.Expert="Expert"
-AMF.H264.View.Master="Meester"
-AMF.H264.Debug="Debug"
-AMF.H264.Debug.Description="Schakel extra debug-logging in, dit moet actief zijn als je ondersteuning nodig hebt met deze encoder."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/pl-PL.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/pl-PL.ini
Changed
-AMF.Util.Default="Domyślne"
-AMF.Util.Automatic="Automatycznie"
-AMF.Util.Manual="Ustawienia własne"
-AMF.Util.Toggle.Disabled="Wyłączone"
-AMF.Util.Toggle.Enabled="Włączone"
-AMF.H264.Preset="Profil"
-AMF.H264.Preset.ResetToDefaults="Przywróć ustawienia domyślne"
-AMF.H264.Preset.Recording="Nagrywanie"
-AMF.H264.Preset.HighQuality="Wysoka jakość"
-AMF.H264.Preset.Indistinguishable="Nie do odróżnienia"
-AMF.H264.Preset.Lossless="Bezstratny"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Sposób użycia"
-AMF.H264.Usage.Transcoding="Konwersja"
-AMF.H264.Usage.UltraLowLatency="Bardzo niskie opóźnienie"
-AMF.H264.Usage.LowLatency="Niskie opóźnienie"
-AMF.H264.QualityPreset="Ustawienie jakości"
-AMF.H264.QualityPreset.Speed="Szybkość"
-AMF.H264.QualityPreset.Balanced="Zrównoważone"
-AMF.H264.QualityPreset.Quality="Jakość"
-AMF.H264.Profile="Profil"
-AMF.H264.ProfileLevel="Profil"
-AMF.H264.RateControlMethod="Metoda kontroli przepływności"
-AMF.H264.RateControlMethod.CQP="Stała QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Stała przepływność (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Zmienna przepływność (z ograniczeniem górnym) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Zmienna przepływność (z ograniczeniem opóźnieniem) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Przepływność docelowa"
-AMF.H264.Bitrate.Target.Description="Średni bitrate do uzyskania w sekwencji."
-AMF.H264.Bitrate.Peak="Przepływność szczytowa"
-AMF.H264.Bitrate.Peak.Description="Maksymalny bitrate do uzyskania w sekwencji."
-AMF.H264.QP.Minimum="Minimalna QP"
-AMF.H264.QP.Minimum.Description="Najniższa wartość QP (parametr kwantyzacji) do użycia w ramce."
-AMF.H264.QP.Maximum="Maksymalna QP"
-AMF.H264.QP.Maximum.Description="Najwyższa wartość QP (parametr kwantyzacji) do użycia w ramce."
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.IFrame.Description="Stała wartość QP dla ramek I-Frame."
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.PFrame.Description="Stała wartość QP dla ramek P-Frame."
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.QP.BFrame.Description="Stała wartość QP (parametr kwantyzacji) do użycia w ramce B-Frame."
-AMF.H264.VBVBuffer="Bufor VBV"
-AMF.H264.VBVBuffer.Strictness="Odchylenie bufora VBV"
-AMF.H264.VBVBuffer.Strictness.Description="Określa rygor wykorzystania bufora VBV, gdzie 100% oznacza pełne podporządkowanie wartości a 0% - dowolne."
-AMF.H264.VBVBuffer.Size="Rozmiar bufora VBV"
-AMF.H264.VBVBuffer.Size.Description="Rozmiar bufora VBV używanego w kontroli przepływności sekwencji kodowania."
-AMF.H264.VBVBuffer.Fullness="Zapełnienie bufora VBV"
-AMF.H264.VBVBuffer.Fullness.Description="Jak bardzo zapełniony na starcie powinien być bufor VBV. Wpływa jedynie na początkową sekwencję kodowania."
-AMF.H264.FillerData="Filler Data"
-AMF.H264.FrameSkipping="Pomijanie klatek"
-AMF.H264.EnforceHRDCompatibility="Wymuszanie zgodności HRD"
-AMF.H264.KeyframeInterval="Interwał klatki kluczowej"
-AMF.H264.KeyframeInterval.Description="Określa odległość (w sekundach) między klatkami kluczowymi oraz kontroluje rozmiar GOP."
-AMF.H264.IDRPeriod="Okres IDR"
-AMF.H264.IDRPeriod.Description="Określa (w klatkach) odległość między natychmiastowymi odświeżeniami dekodera. Kontroluje również rozmiar sekwencji GOP.
-"
-AMF.H264.DeblockingFilter="Filtr niwelujacy bloki obrazu"
-AMF.H264.ScanType="Metoda skanowania"
-AMF.H264.ScanType.Progressive="Progresywne"
-AMF.H264.ScanType.Interlaced="Z przeplotem"
-AMF.H264.MotionEstimation="Szacowania ruchu"
-AMF.H264.MotionEstimation.None="Żaden"
-AMF.H264.MotionEstimation.Half="Pół piksela"
-AMF.H264.MotionEstimation.Quarter="Kwartał piksela"
-AMF.H264.MotionEstimation.Both="Pół i kwartał piksela"
-AMF.H264.CodingType="Typ kodowania"
+Utility.Default="Domyślnie"
+Utility.Automatic="Automatycznie"
+Utility.Manual="Ustawienia własne"
+Utility.Switch.Disabled="Wyłączone"
+Utility.Switch.Enabled="Włączone"
+Preset="Profil"
+Preset.ResetToDefaults="Przywróć ustawienia domyślne"
+Preset.Recording="Nagrywanie"
+Preset.HighQuality="High Quality (wysoka jakość)"
+Preset.Indistinguishable="Nie do odróżnienia"
+Preset.Lossless="Lossless (bezstratny)"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Sposób użycia"
+Usage.Transcoding="Konwersja"
+Usage.UltraLowLatency="Bardzo niskie opóźnienie"
+Usage.LowLatency="Niskie opóźnienie"
+Usage.Webcam="Kamerka internetowa"
+QualityPreset="Ustawienie jakości"
+QualityPreset.Speed="Szybkość"
+QualityPreset.Balanced="Zrównoważone"
+QualityPreset.Quality="Jakość"
+Profile="Profil"
+ProfileLevel="Profil - poziom"
+Tier="Poziom"
+AspectRatio="Współczynnik proporcji"
+CodingType="Typ kodowania"
+AMF.H264.MaximumLTRFrames="Maksimum ramek LTR"
+AMF.H264.MaximumLTRFrames.Description="Long Term Reference (LTR) Frames (ramki LTR) to opcja pozwalająca na oznaczenie przez enkoder ramek, które mają być referencyjnymi przez dłuższy czas.\nRamki LTR nie mogą być używane wraz z B-ramkami. Enkoder wyłączy B-ramki w przypadku użycia opcji ramek LTR."
AMF.H264.MaximumAccessUnitSize="Maksymalny rozmiar Access Unit"
AMF.H264.MaximumAccessUnitSize.Description="Maksymalny rozmiar Access Unit. Wartość 0 umożliwia enkoderowi wybranie najlepszej wartości."
AMF.H264.HeaderInsertionSpacing="Rozmiar nagłówka (w klatkach)"
AMF.H264.HeaderInsertionSpacing.Description="Ile klatek powinno być między nagłówkami. Nie zaleca się zmieniać wartości na inną niż 0 (automatycznie)."
-AMF.H264.VideoAPI="Typ API"
-AMF.H264.VideoAPI.Description="Rodzaj API wykorzystywanego do enkodowania."
-AMF.H264.VideoAdapter="Karta graficzna"
-AMF.H264.VideoAdapter.Description="Karta graficzna wykorzystywana do enkodowania."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Czy enkoder ma użyć OpenCL do wysyłania poszczególnych klatek?"
-AMF.H264.View="Tryb ustawień"
-AMF.H264.View.Description="Którego typu ustawień używać.\nWłączenie '\@AMF.H264.View.Master\@' oznacza rezygnację ze wsparcia technicznego."
-AMF.H264.View.Basic="Podstawowy"
-AMF.H264.View.Advanced="Zaawansowany"
-AMF.H264.View.Expert="Ekspercki"
-AMF.H264.View.Master="Pełny"
-AMF.H264.Debug="Debugowanie"
-AMF.H264.Debug.Description="Włącza dodatkowe opcje logowania w trybie debug. Przydatne w przypadku poszukiwania wsparcia technicznego dla tego enkodera."
+AMF.H264.WaitForTask="Wait For Task"
+AMF.H264.WaitForTask.Description="Nieznany, eksperymentalne"
+AMF.H264.SlicesPerFrame="Liczba bloków w jednej klatce"
+AMF.H264.SlicesPerFrame.Description="Ile bloków I-ramek powinno być zapisanych w jednej ramce?\nWartość 0 oznacza, że enkoder decyduje automatycznie.\nEnkodowanie Intra-Refresh używane jest w celu szybszego odtwarzania i przewijania."
+AMF.H264.SliceMode="Slice Mode"
+AMF.H264.SliceMode.Description="Nieznany, eksperymentalne"
+AMF.H264.MaximumSliceSize="Maksymalny rozmiar bloku"
+AMF.H264.MaximumSliceSize.Description="Nieznany, eksperymentalne"
+AMF.H264.SliceControlMode="Tryb kontroli bloku"
+AMF.H264.SliceControlMode.Description="Nieznany, eksperymentalne"
+AMF.H264.SliceControlSize="Rozmiar kontroli bloku"
+AMF.H264.SliceControlSize.Description="Nieznany, eksperymentalne"
+AMF.H264.IntraRefresh.NumberOfStripes="Liczba pasków Intra-Refresh"
+AMF.H264.IntraRefresh.NumberOfStripes.Description="Nieznany, eksperymentalne"
+AMF.H264.IntraRefresh.MacroblocksPerSlot="Liczba makrobloków Intra-Refresh na slot"
+AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Ile makrobloków należy zapisać w jednym slocie?\n0 wyłącza opcję\nEnkodowanie Intra-Refresh pozwala na szybsze odtwarzanie i przewijanie."
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/pt-BR.ini
Added
+AMF.H264.SliceControlMode.Description="Desconhecido, Experimental"
+AMF.H264.SliceControlSize.Description="Desconhecido, Experimental"
+
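The locale diffs above follow one consistent pattern: the 18.0.2 enc-amf plugin drops the `AMF.Util.` and `AMF.H264.` key prefixes (e.g. `AMF.Util.Default` → `Utility.Default`, `AMF.H264.Preset.Recording` → `Preset.Recording`) and renames the switch keys (`AMF.Util.Toggle.Enabled` → `Utility.Switch.Enabled`). The following is an illustrative sketch only, not part of the obs-studio sources: a minimal key-migration helper inferred from the renames visible in these diffs. Note the real change was selective — some experimental `AMF.H264.*` keys appear as unchanged context lines in the diffs — whereas this sketch rewrites every prefixed key.

```python
# Hypothetical helper inferred from the 18.0.2 locale diffs above; the actual
# upstream migration was done in the enc-amf plugin sources, not with this code.

OLD_TOGGLE = "AMF.Util.Toggle."
OLD_UTIL = "AMF.Util."
OLD_H264 = "AMF.H264."

def migrate_key(key: str) -> str:
    """Map a pre-18.0.2 enc-amf locale key to the new unprefixed form."""
    if key.startswith(OLD_TOGGLE):
        # AMF.Util.Toggle.Enabled -> Utility.Switch.Enabled
        return "Utility.Switch." + key[len(OLD_TOGGLE):]
    if key.startswith(OLD_UTIL):
        # AMF.Util.Default -> Utility.Default
        return "Utility." + key[len(OLD_UTIL):]
    if key.startswith(OLD_H264):
        # AMF.H264.Preset.Recording -> Preset.Recording
        return key[len(OLD_H264):]
    return key  # already in the new form

def migrate_lines(lines):
    """Rewrite 'Key="Value"' lines of a locale .ini, leaving other lines alone."""
    out = []
    for line in lines:
        if "=" in line and not line.lstrip().startswith(";"):
            key, _, value = line.partition("=")
            out.append(migrate_key(key.strip()) + "=" + value)
        else:
            out.append(line)
    return out
```

For example, `migrate_lines(['AMF.H264.Profile="Profil"'])` yields `['Profile="Profil"']`, matching the `-AMF.H264.Profile="Profil"` / `+Profile="Profil"` pair in the nl-NL diff.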
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/ru-RU.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/ru-RU.ini
Changed
-AMF.Util.Default="По умолчанию"
-AMF.Util.Automatic="Автоматически"
-AMF.Util.Manual="Руководство"
-AMF.Util.Toggle.Disabled="Выключено"
-AMF.Util.Toggle.Enabled="Включено"
-AMF.H264.Preset="Предустановка"
-AMF.H264.Preset.ResetToDefaults="Сброс по умолчанию"
-AMF.H264.Preset.Recording="Запись"
-AMF.H264.Preset.HighQuality="Высокое качество"
-AMF.H264.Preset.Indistinguishable="Незаметные потери"
-AMF.H264.Preset.Lossless="Без потерь"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Назначение"
-AMF.H264.Usage.Description="На какой режим использования должен быть настроен AMF:\n- 'Перекодировка' — для перекодировки общего назначения (рекомендуется),\n- 'Очень низкая задержка' — для кодировки с очень низкой задержкой,\n- 'Низкая задержка' похожа на пункт выше, но с немного большей задержкой.\nТрансляции поддерживают только 'перекодировку', все остальные значения могут быть использованы для записи."
-AMF.H264.Usage.Transcoding="Транскодирование"
-AMF.H264.Usage.UltraLowLatency="Очень низкая задержка"
-AMF.H264.Usage.LowLatency="Низкая задержка"
-AMF.H264.QualityPreset="Предустановки качества"
-AMF.H264.QualityPreset.Description="К какому пресету качества AMF нужно стремиться:\n-'Speed' самый быстрый, но имеет самое низкое качество,\n-'Balanced' между 'Speed' и 'Quality', предоставляет хорошее соотношение между скоростью и качеством,\n-'Quality' предоставляет наилучшее качество для предоставленного битрейта."
-AMF.H264.QualityPreset.Speed="Скорость"
-AMF.H264.QualityPreset.Balanced="Баланс"
-AMF.H264.QualityPreset.Quality="Качество"
-AMF.H264.Profile="Профиль кодирования"
-AMF.H264.Profile.Description="Какой профиль формата H.264, использовать для кодирования, отсортированный от высочайшего качества для самой широкой поддержки."
-AMF.H264.ProfileLevel="Уровень профиля"
-AMF.H264.ProfileLevel.Description="Какой уровень профиля H.264 использовать для кодирования:\n- 'Автоматически' выбирает наиболее подходящий уровень под выбранную частоту и размер кадра,\n- '4.1' поддерживает 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' поддерживает 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' поддерживает 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' поддерживает 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' поддерживает 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="Метод кодирования"
-AMF.H264.RateControlMethod.Description="Как оценить способ контроля должны быть использованы:\n- '\@AMF.H264.RateControlMethod.CQP\@' назначает фиксированные I-/P-/B-Значения кадра QP,\n- '\@AMF.H264.RateControlMethod.CBR\@' остается в данном целевом Битрейте (используя данные заполнителя) (рекомендуется для стрима),\n-'\@AMF.H264.RateControlMethod.VBR\@' остается ниже пика Битрейта,\n-'\@AMF.H264.RateControlMethod.VBR_LAT\@' остается близко к целевому Битрейту, если задержка GPU и нагрузки позволяют, в противном случае будут использовать более высокий Битрейт (рекомендуется для стрима)."
-AMF.H264.RateControlMethod.CQP="CQP: постоянное качество"
-AMF.H264.RateControlMethod.CBR="CBR: постоянный битрейт"
-AMF.H264.RateControlMethod.VBR.Peak="VBR: переменный битрейт (Ограничение пиков)"
-AMF.H264.RateControlMethod.VBR.Latency="VBR: переменный битрейт (Ограничение задержки)"
-AMF.H264.Bitrate.Target="Битрейт"
-AMF.H264.Bitrate.Target.Description="Битрейт желаемый для получения во всей последовательности."
-AMF.H264.Bitrate.Peak="Пиковый битрейт"
-AMF.H264.Bitrate.Peak.Description="Максимальный пиковый битрейт желаемый для получения во всей последовательности."
-AMF.H264.QP.Minimum="Минимальное QP"
-AMF.H264.QP.Minimum.Description="Наименьшее значение QP (Параметр квантования) для использования в кадре."
-AMF.H264.QP.Maximum="Максимальное QP"
-AMF.H264.QP.Maximum.Description="Наибольшее значение QP (Параметр квантования) для использования в кадре."
-AMF.H264.QP.IFrame="I-кадр QP"
-AMF.H264.QP.IFrame.Description="Фиксированное значение QP используемое для I-Кадров."
-AMF.H264.QP.PFrame="P-кадр QP"
-AMF.H264.QP.PFrame.Description="Фиксированное значение QP используемое для P-Кадров."
-AMF.H264.QP.BFrame="B-кадр QP"
-AMF.H264.QP.BFrame.Description="Фиксированное значение QP (Параметр квантования) для использования в B-Кадрах."
-AMF.H264.VBVBuffer="Буфер VBV"
-AMF.H264.VBVBuffer.Description="Какой метод следует использовать для определения Размера буфера VBV:\n- '\@AMF.Util.Automatic\@' рассчитывает размер ограничивающийся строгостью,\n- '\@AMF.Util.Manual\@' позволяет пользователю контролировать размер буфера.\nVBV (Верификатор буферизации видео) буфер используется некоторыми Методами управления скоростью, чтобы сохранить общий битрейт в пределах заданных ограничений."
-AMF.H264.VBVBuffer.Strictness="Строгость буфера VBV"
-AMF.H264.VBVBuffer.Strictness.Description="Определяет жесткость буфера vbv, при 100% будет максимально жестким и 0% будет неограниченным."
-AMF.H264.VBVBuffer.Size="Размер буфера VBV"
-AMF.H264.VBVBuffer.Size.Description="Размер буфера vbv, которая используется для управления скорости передачи данных в последовательности."
-AMF.H264.VBVBuffer.Fullness="Заполнение буфера VBV"
-AMF.H264.VBVBuffer.Fullness.Description="Изначальная степень заполнения VBV буфера, будет влиять только на первоначальную последовательность кодирования."
-AMF.H264.FillerData="Данные наполнителя"
-AMF.H264.FillerData.Description="Включение данных наполнителей позволяет кодировщику сохранять, по крайней мере, целевой Битрейт, заполняя оставшееся пространство в последовательности с пустой информации."
-AMF.H264.FrameSkipping="Пропуск кадров"
-AMF.H264.FrameSkipping.Description="Пропуск кадров позволяет кодировщику падение кадров в целях соответствия требованиям целевого битрейта.\nКогда в энкодере пропадает рамка вместо этого вставьте повторять-последний кадр NAL в стрим.\nМожет помочь с очень низким битрейтом."
-AMF.H264.EnforceHRDCompatibility="Принудительная HRD совместимость"
-AMF.H264.EnforceHRDCompatibility.Description="Соблюдение гипотетическому эталону Декодера ограничения, которые ограничивают максимальное значение QP изменения в кадре."
-AMF.H264.KeyframeInterval="Интервал ключевых кадров"
-AMF.H264.KeyframeInterval.Description="Сколько секунд должен быть просадок кадров.\nТакже контролирует GOP Size."
-AMF.H264.IDRPeriod="Период IDR"
-AMF.H264.IDRPeriod.Description="Определяет расстояние между Мгновенными обновлениями декодирования (IDR) в кадрах. Так же контролирует размер последовательности GOP."
-AMF.H264.BFrame.Pattern="B-Кадры"
-AMF.H264.BFrame.Pattern.Description="Количество B-кадров использованное при кодировании.\nПоддерживается 2-й и 3-й VCE карт поколения. Негативно влияет на производительность кодирования."
-AMF.H264.BFrame.DeltaQP="QP дельта для B-Кадров"
-AMF.H264.BFrame.DeltaQP.Description="Значение дельты QP в последних I- или P-Кадров для нессылаемых B-Кадров."
-AMF.H264.BFrame.Reference="Ссылаемые B-Кадры"
-AMF.H264.BFrame.Reference.Description="Разрешить B-Кадру так же использовать B-Кадры как ссылки, вместо просто P- и I-Кадров."
-AMF.H264.BFrame.ReferenceDeltaQP="Дельта QP для ссылаемых B-кадров"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="Значение дельты QP в последних I- или P-Кадров для ссылаемых B-Кадров."
-AMF.H264.DeblockingFilter="Фильтр деблокинга"
-AMF.H264.DeblockingFilter.Description="Устанавливает флаг, что декодер может использовать фильтр удаления блочности для прямой трансляции."
-AMF.H264.ScanType="Развертка"
-AMF.H264.ScanType.Description="Какой режим сканирования использовать; всегда оставляется этот параметр на 'прогрессивном'."
-AMF.H264.ScanType.Progressive="Прогрессивная"
-AMF.H264.ScanType.Interlaced="Чересстрочная"
-AMF.H264.MotionEstimation="Оценка движения"
-AMF.H264.MotionEstimation.Description="Оценки движения позволяет кодировщику необходимость уменьшить Битрейт, оценивая, где пиксель прошел."
-AMF.H264.MotionEstimation.None="Нет"
-AMF.H264.MotionEstimation.Half="Пол-пиксельная"
-AMF.H264.MotionEstimation.Quarter="Четверть-пиксельная"
-AMF.H264.MotionEstimation.Both="Пол- & Четверть-пиксельная"
-AMF.H264.CodingType="Тип кодирования"
-AMF.H264.CodingType.Description="Какой тип кодирования использовать:\n* \@AMF.Util.Default\@ позволяет решать AMF (рекомендуется).\n* CAVLC (контекстно-адаптивное кодирование с переменной длиной) - это быстрее, но больше.\n* CABAC (контекстно-адаптивное двоичное арифметическое кодирование) - это медленнее, но меньше."
+Utility.Default="По умолчанию"
+Utility.Automatic="Автоматически"
+Utility.Manual="Вручную"
+Utility.Switch.Disabled="Выключено"
+Utility.Switch.Enabled="Включено"
+Preset="Предустановка"
+Preset.ResetToDefaults="Сбросить на значения по умолчанию"
+Preset.Recording="Запись"
+Preset.HighQuality="Высокое качество"
+Preset.Indistinguishable="Незаметные потери"
+Preset.Lossless="Без потерь"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Использование"
+Usage.Description="На какой режим использования должен быть настроен AMF:\n- '\@Usage.Transcoding\@' — для перекодировки общего назначения (рекомендуется),\n- '\@Usage.UltraLowLatency\@' — для кодировки с очень низкой задержкой,\n- '\@Usage.LowLatency\@' похожа на пункт выше, но с немного большей задержкой.\nТрансляции поддерживают только '\@Usage.Transcoding\@', все остальные значения могут быть использованы для записи."
+Usage.Transcoding="Транскодирование"
+Usage.UltraLowLatency="Очень низкая задержка"
+Usage.LowLatency="Низкая задержка"
+Usage.Webcam="Вебкамера"
+QualityPreset="Предустановки качества"
+QualityPreset.Description="К какой предустановке качества AMF нужно стремиться:\n- '\@QualityPreset.Speed\@' самая быстрая, но низкого качества,\n- '\@QualityPreset.Balanced\@' сбалансированное сочетание обоих,\n- '\@QualityPreset.Quality\@' предоставляет наилучшее качество для заданного битрейта."
+QualityPreset.Speed="Скорость"
+QualityPreset.Balanced="Баланс"
+QualityPreset.Quality="Качество"
+Profile="Профиль"
+Profile.Description="Какой профиль использовать для кодирования. Сортировка от лучшей поддержки (сверху) до лучшего качества (снизу)."
+ProfileLevel="Уровень профиля"
+ProfileLevel.Description="Уровень используемого профиля. Лучше оставить \@Utility.Automatic\@."
+Tier="Уровень"
+Tier.Description="На каком Уровне кодировать. \"Высокий\" нацелен на высокий битрейт/использование пропускной способности, в то время как \"Основной\" нацелен на средства массовой информации."
+AspectRatio="Соотношение сторон"
+AspectRatio.Description="С каким соотношением сторон должен быть записан выходной файл."
+CodingType="Тип кодирования"
+CodingType.Description="Какой тип кодирования использовать:\n* '\@Utility.Automatic\@' позволяет решать AMF (рекомендуется).\n* 'CAVLC' (контекстно-адаптивное кодирование с переменной длиной) - это быстрее, но больше.\n* 'CABAC' (контекстно-адаптивное двоичное арифметическое кодирование) - это медленнее, но меньше."
+MaximumReferenceFrames="Максимум кадров-ссылок"
+MaximumReferenceFrames.Description="Количество кадров, используемых кодировщиком в качестве опорных. Имеет непосредственное влияние на качество кодирования."
+RateControlMethod="Метод кодирования"
+RateControlMethod.Description="Какой способ контроля битрейта должен быть использован:\n- '\@RateControlMethod.CQP\@' задает фиксированные значения QP I-/P-/B-кадра,\n- '\@RateControlMethod.CBR\@' придерживается целевого битрейта (используя данные заполнителя) (рекомендуется для стрима),\n- '\@RateControlMethod.VBR\@' держится ниже пикового битрейта,\n- '\@RateControlMethod.VBRLAT\@' держится близко к целевому битрейту, если задержка GPU и нагрузка позволяют, в противном случае будет использован более высокий битрейт (рекомендуется для стрима)."
+RateControlMethod.CQP="Постоянное качество (CQP)"
+RateControlMethod.CBR="Постоянный битрейт (CBR)"
+RateControlMethod.VBR="Переменный битрейт (Ограничение пиков) (VBR)"
+RateControlMethod.VBRLAT="Переменный битрейт (Ограничение задержки) (VBR)"
+PrePassMode="Режим пред-прохода"
+PrePassMode.Description="Пред-проход - это вторичный проход распространения битрейта, который позволяет лучше распространить битрейт по последовательности, однако эффект от этого может варьироваться от карты к карте."
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (Четверть размера)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (Половина размера)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (Полный размер)"
+Bitrate.Target="Битрейт"
+Bitrate.Target.Description="Битрейт, желаемый для получения во всей последовательности."
+Bitrate.Peak="Пиковый битрейт"
+Bitrate.Peak.Description="Максимальный пиковый битрейт, желаемый для получения во всей последовательности."
+QP.IFrame="I-кадр QP"
+QP.IFrame.Description="Фиксированное значение QP используемое для I-Кадров."
+QP.PFrame="P-кадр QP"
+QP.PFrame.Description="Фиксированное значение QP используемое для P-Кадров."
+QP.BFrame="B-кадр QP"
+QP.BFrame.Description="Фиксированное значение QP используемое для B-Кадров."
+QP.Minimum="Минимальное QP"
+QP.Minimum.Description="Наименьшее значение QP для использования в Кадре."
+QP.IFrame.Minimum="Минимальный QP I-Кадра"
+QP.IFrame.Minimum.Description="Наименьшее значение QP используемое для I-Кадров."
+QP.PFrame.Minimum="Минимальный QP P-Кадра"
+QP.PFrame.Minimum.Description="Наименьшее значение QP, используемое для P-Кадров."
+QP.Maximum="Максимальное QP"
+QP.Maximum.Description="Наибольшее значение QP для использования в Кадре."
+QP.IFrame.Maximum="Максимальный QP I-Кадра"
+QP.IFrame.Maximum.Description="Наибольшее значение QP для использования в I-Кадре."
+QP.PFrame.Maximum="Максимальный QP P-Кадра"
+QP.PFrame.Maximum.Description="Наибольшее значение QP для использования в P-Кадре."
+FillerData="Данные наполнителя"
+FillerData.Description="Включение данных-заполнителей позволяет кодировщику сохранять как минимум целевой битрейт, заполняя оставшееся пространство в последовательности пустой информацией."
+FrameSkipping="Пропуск кадров"
+FrameSkipping.Description="Пропуск кадров позволяет кодировщику отбрасывать кадры, чтобы уложиться в целевой битрейт.\nКогда кодировщик отбрасывает кадр, вместо него в поток вставляется NAL-единица повтора последнего кадра.\nМожет помочь при очень низком битрейте."
+VBAQ="VBAQ"
+VBAQ.Description="Включить \"Адаптивное квантование на основе дисперсии\" (VBAQ), которое использует дисперсию пикселей для лучшего распределения битрейта.\nМетод основан на том, что зрительная система человека менее чувствительна к артефактам в сильно текстурированных областях, поэтому битрейт смещается в сторону гладких поверхностей.\nВключение может улучшить субъективное качество на некотором содержимом."
+EnforceHRD="Принудительно использовать HRD"
+EnforceHRD.Description="Принудительно использовать HRD (Hypothetical Reference Decoder), используемого для проверки выходного потока данных."
+VBVBuffer="Буфер VBV"
+VBVBuffer.Description="Какой метод следует использовать для определения размера буфера VBV:\n- '\@Utility.Automatic\@' рассчитывает размер исходя из строгости,\n- '\@Utility.Manual\@' позволяет пользователю контролировать размер буфера.\nБуфер VBV (верификатор буферизации видео) используется некоторыми методами управления битрейтом, чтобы удерживать общий битрейт в заданных пределах."
+VBVBuffer.Strictness="Строгость буфера VBV"
+VBVBuffer.Strictness.Description="Определяет жесткость буфера VBV, при 100% будет максимально жестким и 0% будет неограниченным."
+VBVBuffer.Size="Размер буфера VBV"
+VBVBuffer.Size.Description="Размер буфера VBV, который используется для управления битрейтом в последовательности."
+VBVBuffer.InitialFullness="Изначальное заполнение буфера VBV"
+VBVBuffer.InitialFullness.Description="Изначальная степень заполнения VBV буфера (в %), будет влиять только на первоначальную последовательность кодирования."
+KeyframeInterval="Интервал ключевых кадров"
+KeyframeInterval.Description="Интервал (в секундах) между ключевыми кадрами."
+H264.IDRPeriod="IDR период (в кадрах)"
+H264.IDRPeriod.Description="Определяет расстояние между Мгновенными обновлениями декодирования (IDR) в кадрах. Так же контролирует размер последовательности GOP."
+H265.IDRPeriod="IDR период (в группах изображений)"
+H265.IDRPeriod.Description="Определяет расстояние между \"Мгновенными обновлениями декодирования\" (IDR) в GOP."
+GOP.Type="Тип GOP"
+GOP.Type.Description="Какой тип GOP должен быть использован:\n- '\@GOP.Type.Fixed\@' всегда будет использовать фиксированные расстояния между каждой GOP.\n- '\@GOP.Type.Variable\@' использует GOP различных размеров, в зависимости от того, какой нужен.\n'\@GOP.Type.Fixed\@' работает так же как устроен H264 и это лучшее решение для трансляции по локальной сети, в то время как '\@GOP.Type.Variable\@' лучше всего подходит для высококачественных записей низкого размера."
+GOP.Type.Fixed="Постоянный"
+GOP.Type.Variable="Переменный"
+GOP.Size="Размер GOP"
+GOP.Size.Description="Размер GOP (группы изображений) в кадрах."
+GOP.Size.Minimum="Минимальный размер GOP"
+GOP.Size.Minimum.Description="Минимальный размер GOP (группы изображений) в кадрах."
+GOP.Size.Maximum="Максимальный размер GOP"
+GOP.Size.Maximum.Description="Максимальный размер GOP (группы изображений) в кадрах."
+GOP.Alignment="Выравнивание GOP"
+GOP.Alignment.Description="Экспериментально, последствия неизвестны. Используйте на свой страх и риск."
+BFrame.Pattern="Структура B-Кадров"
+BFrame.Pattern.Description="Количество B-кадров, используемых при кодировании.\nПоддерживается 2-м и 3-м поколением VCE карт. Негативно влияет на производительность кодирования."
+BFrame.DeltaQP="Отклонение QP B-Кадров"
+BFrame.DeltaQP.Description="Отклонение QP для не опорных B-Кадров, по отношению к последнему I- или P-Кадру."
+BFrame.Reference="Опорный B-Кадр"
+BFrame.Reference.Description="Разрешить B-Кадру так же использовать B-Кадры как ссылки, вместо просто P- и I-Кадров."
+BFrame.ReferenceDeltaQP="Отклонение QP опорных B-кадров"
+BFrame.ReferenceDeltaQP.Description="Значение дельты QP в последних I- или P-Кадрах для ссылаемых B-Кадров."
+DeblockingFilter="Фильтр деблокинга"
+DeblockingFilter.Description="Разрешить декодеру использовать деблокинг-фильтр."
+MotionEstimation="Оценка движения"
+MotionEstimation.Description="Оценка движения позволяет кодировщику снизить битрейт, оценивая, куда переместился пиксель."
+MotionEstimation.Quarter="Четверть-пиксельная"
+MotionEstimation.Half="Пол-пиксельная"
+MotionEstimation.Full="Четверть- и пол-пиксельная"
+Video.API="Video API"
+Video.API.Description="Какой API должен использовать бэкенд?"
+Video.Adapter="Видеоадаптер"
+Video.Adapter.Description="На каком Адаптере мы должны пытаться кодировать?"
+OpenCL="OpenCL"
+OpenCL.Description="Должен ли использоваться OpenCL для подтверждения Кадров? Технически быстрее, но вызывает проблемы с драйверами Intel (в связи с несовместимыми библиотеками OpenCL)."
+View="Режим просмотра"
+View.Description="Какие параметры должны быть показаны?\nВыбор '\@View.Master\@' лишит вас возможности получения помощи."
+View.Basic="Обычный"
+View.Advanced="Расширенный"
+View.Expert="Эксперт"
+View.Master="Мастер"
+Debug="Отладка"
+Debug.Description="Включить дополнительные отладочные сообщения. Необходим запуск Open Broadcaster Software Studio с ключами '--verbose --log_unfiltered'."
AMF.H264.MaximumLTRFrames="Максимум LTR-кадров"
AMF.H264.MaximumLTRFrames.Description="Long Term Reference (LTR) Frames - функция, позволяющая кодировщику помечать определённые кадры последовательности как опорные.\nLTR-кадры нельзя использовать вместе с B-кадрами, и кодировщик отключит B-кадры, если используются LTR."
AMF.H264.MaximumAccessUnitSize="Максимальный Размер Блока Доступа"
AMF.H264.HeaderInsertionSpacing.Description="Сколько кадров должно быть между заголовками NAL. Не рекомендуется менять значение с 0 (автоматически)."
AMF.H264.WaitForTask="Дождитесь Задач"
AMF.H264.WaitForTask.Description="Неизвестно, Экспериментально"
-AMF.H264.PreAnalysisPass="Проход пред-анализа"
-AMF.H264.PreAnalysisPass.Description="Неизвестно, Экспериментально"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Неизвестно, Экспериментально"
-AMF.H264.GOPSize="Размер GOP"
-AMF.H264.GOPSize.Description="Неизвестно, Экспериментально"
-AMF.H264.GOPAlignment="Выравнивание GOP"
-AMF.H264.GOPAlignment.Description="Неизвестно, Экспериментально"
-AMF.H264.MaximumReferenceFrames="Максимум кадров-ссылок"
-AMF.H264.MaximumReferenceFrames.Description="Неизвестно, Экспериментально"
AMF.H264.SlicesPerFrame="Количество частей на кадр"
AMF.H264.SlicesPerFrame.Description="Сколько кусков I-Кадров должно быть сохранено в каждом кадре?\nЗначение 0 позволяет кодировщику выбирать \"на лету\".\nКодирование Intra-Refresh используется для быстрого воспроизведения и поиска при перемотке."
AMF.H264.SliceMode="Режим Slice"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Неизвестно, Экспериментально"
AMF.H264.IntraRefresh.MacroblocksPerSlot="Количество Intra-Refresh Макроблоков на слот"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Сколько Макроблоков должно быть сохранено в каждом слоте?\nЗначение 0 отключает эту функцию.\nКодирование Intra-Refresh используется для быстрого воспроизведения и поиска при перемотке."
-AMF.H264.VideoAPI="Video API"
-AMF.H264.VideoAPI.Description="Какой API использовать для кодирования."
-AMF.H264.VideoAdapter="Видеоадаптер"
-AMF.H264.VideoAdapter.Description="Какой видеоадаптер использовать для кодирования."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Должен ли кодировщик использовать OpenCL для подтверждения индивидуальных кадров?"
-AMF.H264.View="Режим просмотра"
-AMF.H264.View.Description="Какие параметры должны быть видны. Вы не будете получать поддержку при использовании режимов 'Эксперт' и 'Мастер'."
-AMF.H264.View.Basic="Обычный"
-AMF.H264.View.Advanced="Расширенный"
-AMF.H264.View.Expert="Эксперт"
-AMF.H264.View.Master="Мастер"
-AMF.H264.Debug="Отладка"
-AMF.H264.Debug.Description="Включить дополнительные логи для отладки; следует включить, если вам нужна поддержка с этим кодировщиком."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/sr-CS.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/sr-CS.ini
Changed
-AMF.Util.Default="Podrazumevano"
-AMF.Util.Automatic="Automatski"
-AMF.Util.Manual="Ručno"
-AMF.Util.Toggle.Disabled="Onemogućeno"
-AMF.Util.Toggle.Enabled="Omogućeno"
-AMF.H264.Preset="Šablon"
-AMF.H264.Preset.ResetToDefaults="Vrati na podrazumevane vrednosti"
-AMF.H264.Preset.Recording="Snimanje"
-AMF.H264.Preset.HighQuality="Visoki kvalitet"
-AMF.H264.Preset.Indistinguishable="Istovetno"
-AMF.H264.Preset.Lossless="Bez gubitaka"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Upotreba"
-AMF.H264.Usage.Description="Za koju upotrebu bi AMF trebao biti podešen:\n- '\@AMF.H264.Usage.Transcoding\@' je transkoding opšte namene (preporučeno),\n- '\@AMF.H264.Usage.UltraLowLatency\@' je za enkoding veoma niskog kašnjenja,\n- '\@AMF.H264.Usage.LowLatency\@' je slično prethodnom sa malo višim kašnjenjem.\nEmitovanje podržava samo '\@AMF.H264.Usage.Transcoding\@', sve ostale vrednosti se mogu koristiti za snimanje."
-AMF.H264.Usage.Transcoding="Transkodiranje"
-AMF.H264.Usage.UltraLowLatency="Ultra nisko kašnjenje"
-AMF.H264.Usage.LowLatency="Nisko kašnjenje"
-AMF.H264.QualityPreset="Šablon kvaliteta"
-AMF.H264.QualityPreset.Description="Koji šablon za kvalitet bi AMF trebao da cilja:\n- '\@AMF.H264.QualityPreset.Speed\@' je najbrži sa najlošijim kvalitetom,\n- '\@AMF.H264.QualityPreset.Balanced\@' je balansiran spoj oba,\n- '\@AMF.H264.QualityPreset.Quality\@' daje najbolji kvalitet za zadati bitrejt."
-AMF.H264.QualityPreset.Speed="Brzina"
-AMF.H264.QualityPreset.Balanced="Izbalansirano"
-AMF.H264.QualityPreset.Quality="Kvalitet"
-AMF.H264.Profile="Profil"
-AMF.H264.Profile.Description="Koji H.264 profil koristiti za enkodiranje:\n- 'Baseline' ima najveću podršku platformi,\n- 'Main' je podržan na starijim uređajima (preporučeno ako se ide ka mobilnim uređajima),\n- 'High' je podržan na aktuelnim uređajima (preporučeno)."
-AMF.H264.ProfileLevel="Nivo profila"
-AMF.H264.RateControlMethod="Metoda kontrole protoka"
-AMF.H264.RateControlMethod.CQP="Konstantan kvalitet (CQP)"
-AMF.H264.RateControlMethod.CBR="Konstantan protok (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Promenjivi protok (sa gornjom granicom) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Promenjivi protok (sa granicom kašnjenja) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Ciljani protok"
-AMF.H264.Bitrate.Peak="Granica protoka"
-AMF.H264.QP.Minimum="Minimalni QP"
-AMF.H264.QP.Maximum="Maksimalni QP"
-AMF.H264.QP.IFrame="I-Frejm QP"
-AMF.H264.QP.PFrame="P-Frejm QP"
-AMF.H264.QP.BFrame="B-Frejm QP"
-AMF.H264.FillerData="Podaci za popunjavanje"
-AMF.H264.FrameSkipping="Preskakanje frejmova"
-AMF.H264.EnforceHRDCompatibility="Prisilna HRD kompatibilnost"
-AMF.H264.DeblockingFilter="Odblokirajući filter"
-AMF.H264.ScanType="Vrsta skeniranja"
-AMF.H264.ScanType.Progressive="Progresivno"
-AMF.H264.ScanType.Interlaced="Isprekidano"
AMF.H264.MaximumLTRFrames="Maksimalan broj LTR frejmova"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/sv-SE.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/sv-SE.ini
Changed
-AMF.Util.Default="Standard"
-AMF.Util.Automatic="Automatisk"
-AMF.Util.Manual="Manuell"
-AMF.Util.Toggle.Disabled="Inaktiverad"
-AMF.Util.Toggle.Enabled="Aktiverad"
-AMF.H264.Preset="Förinställning"
-AMF.H264.Preset.ResetToDefaults="Återställ till standardvärden"
-AMF.H264.Preset.Recording="Spelar in"
-AMF.H264.Preset.HighQuality="Hög kvalitet"
-AMF.H264.Preset.Indistinguishable="Oskiljbar"
-AMF.H264.Preset.Lossless="Förlustfri"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Användning"
-AMF.H264.Usage.Description="Vilken användning AMF borde vara inställd på:\n- '\@AMF.H264.Usage.Transcoding\@' är för allmän omkodning (rekommenderas),\n- '\@AMF.H264.Usage.UltraLowLatency\@' är för omkodning med riktigt låg latens,\n- '\@AMF.H264.Usage.LowLatency\@' liknar ovanstående men med lite högre latens.\nStrömning stöder endast '\@AMF.H264.Usage.Transcoding\@', alla andra värden kan användas för inspelning."
-AMF.H264.Usage.Transcoding="Omkodning"
-AMF.H264.Usage.UltraLowLatency="Ultralåg latens"
-AMF.H264.Usage.LowLatency="Låg latens"
-AMF.H264.QualityPreset="Kvalitetsförinställning"
-AMF.H264.QualityPreset.Description="Vilken kvalitetsmall AMF bör försöka att uppnå:\n- '\@AMF.H264.QualityPreset.Speed\@' är den snabbaste men har den sämsta kvaliteten,\n- '\@AMF.H264.QualityPreset.Balanced\@' är en balanserad mix av båda,\n- '\@AMF.H264.QualityPreset.Quality\@' ger den bästa kvaliteten för en angiven bithastighet."
-AMF.H264.QualityPreset.Speed="Hastighet"
-AMF.H264.QualityPreset.Balanced="Balanserad"
-AMF.H264.QualityPreset.Quality="Kvalitet"
-AMF.H264.Profile="Profil"
-AMF.H264.Profile.Description="Vilken H.264-profil att använda för kodning:\n- 'Baseline' stödjer flest plattformar,\n- 'Main' stöds av äldre enheter (rekommenderas för mobila enheter),\n- 'High' stöds av aktuella enheter (rekommenderas)."
-AMF.H264.ProfileLevel="Profilnivå"
-AMF.H264.ProfileLevel.Description="Vilken H.264-profilnivå att använda för kodning:\n- '\@AMF.Util.Automatic\@' beräknar den bästa profilnivån för den angivna bildfrekvensen och bildstorleken,\n- '4.1' stöder 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2' stöder 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0' stöder 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1' stöder 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2' stöder 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod.CQP="Konstant QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Konstant bithastighet (CBR)"
-AMF.H264.Bitrate.Target="Målets bithastighet"
-AMF.H264.Bitrate.Target.Description="Bithastighet att försöka uppnå i den övergripande sekvensen."
-AMF.H264.Bitrate.Peak="Maximal bithastighet"
-AMF.H264.QP.Minimum="Minimal QP"
-AMF.H264.QP.Minimum.Description="Lägsta QP-värde att använda i en bildruta."
-AMF.H264.QP.Maximum="Maximal QP"
-AMF.H264.QP.Maximum.Description="Högsta QP-värde att använda i en bildruta."
-AMF.H264.QP.IFrame="QP för I-bildrutor"
-AMF.H264.QP.IFrame.Description="Konstant QP-värde att använda för I-bildrutor."
-AMF.H264.QP.PFrame="QP för P-bildrutor"
-AMF.H264.QP.PFrame.Description="Konstant QP-värde att använda för P-bildrutor."
-AMF.H264.QP.BFrame="QP för B-bildrutor"
-AMF.H264.QP.BFrame.Description="Konstant QP-värde att använda för B-bildrutor."
-AMF.H264.VBVBuffer="VBV-buffert"
-AMF.H264.VBVBuffer.Description="Vilken metod som ska användas för att bestämma VBV-buffertens storlek:\n- '\@AMF.Util.Automatic\@' beräknar storleken med hjälp av en strikt begränsning,\n- '\@AMF.Util.Manual\@' låter användaren kontrollera storleken.\nVBV-buffertern (Video Buffering Verifier) används av vissa Rate Control Methods för att hålla den övergripande bithastigheten inom de angivna begränsningarna."
-AMF.H264.VBVBuffer.Strictness="Strikthet för VBV-buffert"
-AMF.H264.VBVBuffer.Strictness.Description="Bestämmer noggrannheten för VBV-bufferten, där 100% är så noggrann som möjligt och 0% är gränslös."
-AMF.H264.VBVBuffer.Size="VBV-buffertstorlek"
-AMF.H264.FillerData="Fyllningsdata"
-AMF.H264.FillerData.Description="Fyllningsdata låter kodaren behålla önskad bithastighet genom att fylla upp det återstående utrymmet i en sekvens med tom information."
-AMF.H264.FrameSkipping="Hoppa över bildrutor"
-AMF.H264.EnforceHRDCompatibility="Tvinga HRD-kompatibilitet"
-AMF.H264.KeyframeInterval="Intervall för keyframes"
-AMF.H264.KeyframeInterval.Description="Definierar avståndet mellan keyframes i sekunder. Kontrollerar även GOP-sekvensens storlek."
-AMF.H264.IDRPeriod="IDR-period"
-AMF.H264.IDRPeriod.Description="Definierar avståndet mellan Instantaneous Decoding Refreshes (IDR) i bildrutor. Kontrollerar även GOP-sekvensens storlek."
-AMF.H264.BFrame.Pattern="B-bildrutor"
-AMF.H264.DeblockingFilter="Avblockningsfilter"
-AMF.H264.ScanType="Typ av skanning"
-AMF.H264.ScanType.Description="Vilken skanningsmetod att använda, lämna alltid detta på '\@AMF.H264.ScanType.Progressive\@'."
-AMF.H264.ScanType.Progressive="Progressiv"
-AMF.H264.ScanType.Interlaced="Sammanflätad"
-AMF.H264.MotionEstimation="Rörelseuppskattning"
-AMF.H264.MotionEstimation.Description="Rörelseuppskattning låter kodaren reducera nödvändig bithastighet genom att uppskatta var en bildpunkt förflyttas."
-AMF.H264.MotionEstimation.None="Ingen"
-AMF.H264.MotionEstimation.Half="Halv bildpunkt"
-AMF.H264.MotionEstimation.Quarter="Fjärdedels bildpunkt"
-AMF.H264.MotionEstimation.Both="Halv och fjärdedels bildpunkt"
-AMF.H264.CodingType="Kodningstyp"
-AMF.H264.CodingType.Description="Vilken typ av kodning som ska användas:\n* \@AMF.Util.Default\@ låter AMF bestämma (rekommenderas).\n* CAVLC (Context-Adaptive Variable-Length Coding) är snabbare, men större.\n* CABAC (Context-Adaptive Binary Arithmetic Coding) är långsammare, men mindre."
+Utility.Default="Standard"
+Utility.Automatic="Automatisk"
+Utility.Manual="Manuell"
+Utility.Switch.Disabled="Inaktiverad"
+Utility.Switch.Enabled="Aktiverad"
+Preset="Förinställning"
+Preset.ResetToDefaults="Återställ till standardvärden"
+Preset.Recording="Spelar in"
+Preset.HighQuality="Hög kvalitet"
+Preset.Indistinguishable="Oskiljbar"
+Preset.Lossless="Förlustfritt"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Användning"
+Usage.Description="Vilken användning AMF borde vara inställd på:\n- '\@Usage.Transcoding\@' är för allmän omkodning (rekommenderas),\n- '\@Usage.UltraLowLatency\@' är för omkodning med riktigt låg latens,\n- '\@Usage.LowLatency\@' liknar ovanstående men med lite högre latens.\nStrömning stöder endast '\@Usage.Transcoding\@', alla andra värden kan användas för inspelning."
+Usage.Transcoding="Omkodning"
+Usage.UltraLowLatency="Ultralåg latens"
+Usage.LowLatency="Låg latens"
+Usage.Webcam="Webbkamera"
+QualityPreset="Kvalitetsförinställning"
+QualityPreset.Description="Vilken kvalitetsmall AMF bör försöka att uppnå:\n- '\@QualityPreset.Speed\@' är den snabbaste men har den sämsta kvaliteten,\n- '\@QualityPreset.Balanced\@' är en balanserad mix av båda,\n- '\@QualityPreset.Quality\@' ger den bästa kvaliteten för en angiven bithastighet."
+QualityPreset.Speed="Hastighet"
+QualityPreset.Balanced="Balanserad"
+QualityPreset.Quality="Kvalitet"
+Profile="Profil"
+Profile.Description="Vilken profil att använda för kodning, sorteras från mest utbrett stöd till högsta kvalitet."
+ProfileLevel="Profilnivå"
+ProfileLevel.Description="Vilken profilnivå att använda för kodning, det är bäst att lämna detta på \@Utility.Automatic\@"
+Tier="Nivå"
+Tier.Description="Vilken nivå att koda på. 'Hög' siktar på hög bithastighet/bandbredd använder medan 'Huvud' siktar på mainstreammedia."
+AspectRatio="Aspektförhållande"
+AspectRatio.Description="Vilket bildförhållande som bör skrivas till utmatningsfilen."
+CodingType="Kodningstyp"
+CodingType.Description="Vilken typ av kodning som ska användas:\n* '\@Utility.Automatic\@' låter AMF bestämma (rekommenderas).\n* 'CAVLC' (Context-Adaptive Variable-Length Coding) är snabbare, men större.\n* 'CABAC' (Context-Adaptive Binary Arithmetic Coding) är långsammare, men mindre."
+MaximumReferenceFrames="Maximalt antal referensbildrutor"
+MaximumReferenceFrames.Description="Hur många bildrutor kodaren kan referera som mest under kodning, påverkar kodningskvalitet avsevärt."
+RateControlMethod.CQP="Konstant QP (CQP)"
+RateControlMethod.CBR="Konstant bithastighet (CBR)"
+RateControlMethod.VBR="Varierande bithastighet (begränsat maxvärde) (VBR)"
+RateControlMethod.VBRLAT="Varierande bithastighet (begränsad latens) (VBR_LAT)"
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (kvarts storlek)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (halv storlek)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (full storlek)"
+Bitrate.Target="Målbithastighet"
+Bitrate.Target.Description="Bithastighet att försöka uppnå i den övergripande sekvensen."
+Bitrate.Peak="Maximal bithastighet"
+Bitrate.Peak.Description="Bithastighet att maximalt försöka uppnå i den övergripande sekvensen."
+QP.IFrame="QP för I-bildrutor"
+QP.IFrame.Description="Konstant QP-värde att använda för I-bildrutor."
+QP.PFrame="QP för P-bildrutor"
+QP.PFrame.Description="Konstant QP-värde att använda för P-bildrutor."
+QP.BFrame="QP för B-bildrutor"
+QP.BFrame.Description="Konstant QP-värde att använda för B-bildrutor."
+QP.Minimum="Minimal QP"
+QP.Minimum.Description="Lägsta QP-värde att använda i en bildruta."
+QP.IFrame.Minimum.Description="Lägsta QP-värde att använda i en I-bildruta."
+QP.PFrame.Minimum.Description="Lägsta QP-värde att använda i en P-bildruta."
+QP.Maximum="Maximal QP"
+QP.Maximum.Description="Högsta QP-värde att använda i en bildruta."
+QP.IFrame.Maximum.Description="Högsta QP-värde att använda i en I-bildruta."
+QP.PFrame.Maximum.Description="Högsta QP-värde att använda i en P-bildruta."
+FillerData="Fyllningsdata"
+FillerData.Description="Fyllningsdata låter kodaren behålla minst \@Bitrate.Target\@ genom att fylla upp det återstående utrymmet i en sekvens med tom information."
+FrameSkipping="Hoppa över bildrutor"
+VBAQ="VBAQ"
+EnforceHRD="Tvinga HRD"
+VBVBuffer="VBV-buffert"
+VBVBuffer.Description="Vilken metod som ska användas för att bestämma VBV-buffertens storlek:\n- '\@Utility.Automatic\@' beräknar storleken med hjälp av en strikt begränsning,\n- '\@Utility.Manual\@' låter användaren kontrollera storleken.\nVBV-bufferten (Video Buffering Verifier) används av vissa Rate Control Methods för att hålla den övergripande bithastigheten inom de angivna begränsningarna."
+VBVBuffer.Strictness="Noggrannhet för VBV-buffert"
+VBVBuffer.Strictness.Description="Bestämmer noggrannheten för VBV-bufferten, där 100% är så noggrann som möjligt och 0% är gränslös."
+VBVBuffer.Size="VBV-buffertstorlek"
+VBVBuffer.Size.Description="Storleken för VBV-bufferten som används för bithastighetskontroll i en sekvens."
+VBVBuffer.InitialFullness.Description="Hur full VBV-bufferten är från början (i %), kommer endast påverka den inledande kodningssekvensen."
+KeyframeInterval="Intervall för keyframes"
+KeyframeInterval.Description="Intervall (i sekunder) mellan keyframes."
+H264.IDRPeriod="IDR-period (i bildrutor)"
+H264.IDRPeriod.Description="Definierar avståndet mellan Instantaneous Decoding Refreshes (IDR) i bildrutor. Kontrollerar även GOP-sekvensens storlek."
+H265.IDRPeriod="IDR-period (i GOP)"
+H265.IDRPeriod.Description="Definierar avståndet mellan Instantaneous Decoding Refreshes (IDR) i GOP."
+GOP.Type="GOP-typ"
+GOP.Type.Description="Vilken typ av GOP bör användas:\n- '\@GOP.Type.Fixed\@' kommer alltid använda fasta avstånd mellan varje GOP.\n- '\@GOP.Type.Variable\@' tillåter GOP i olika storlekar, beroende på vad som behövs.\n'\@GOP.Type.Fixed\@' är hur H264-implementeringen fungerar och är bäst för lokal nätverksströmning, medan '\@GOP.Type.Variable\@' är bäst för korta inspelningar i hög kvalitet."
+GOP.Type.Fixed="Konstant"
+GOP.Type.Variable="Variabel"
+GOP.Size="GOP-storlek"
+GOP.Size.Description="Storleken för en GOP (Group Of Pictures) i bildrutor."
+GOP.Size.Minimum="Minimal GOP-storlek"
+GOP.Size.Minimum.Description="Minimal storlek för en GOP (Group Of Pictures) i bildrutor."
+GOP.Size.Maximum="Maximal GOP-storlek"
+GOP.Size.Maximum.Description="Maximal storlek för en GOP (Group Of Pictures) i bildrutor."
+GOP.Alignment="GOP-justering"
+GOP.Alignment.Description="Experimentell. Bieffekterna är okända. Använd på egen risk."
+BFrame.Pattern.Description="Antalet B-bildrutor att använda under kodning.\nStöds av andra och tredje generationens VCE-kort. Påverkar kodningsprestandan negativt."
+BFrame.DeltaQP.Description="Delta QP-värde till den sista I- eller P-bildrutan för icke-referensbara B-bildrutor."
+BFrame.Reference.Description="Låt en B-bildruta också använda B-bildrutor som referens, i stället för bara P- och I-bildrutor."
+BFrame.ReferenceDeltaQP.Description="Delta QP-värde till den sista I- eller P-bildrutan för referensbara B-bildrutor."
+DeblockingFilter="Avblockningsfilter"
+DeblockingFilter.Description="Låter kodaren tillämpa ett avblockeringsfilter."
+MotionEstimation="Rörelseuppskattning"
+MotionEstimation.Description="Rörelseuppskattning låter kodaren reducera nödvändig bithastighet genom att uppskatta var en bildpunkt förflyttas."
+MotionEstimation.Quarter="Fjärdedels bildpunkt"
+MotionEstimation.Half="Halv bildpunkt"
+MotionEstimation.Full="Fjärdedels och halv bildpunkt"
+Video.API="Video-API"
+Video.API.Description="Vilket API bör användas av backend?"
+Video.Adapter="Grafikkort"
+Video.Adapter.Description="Vilken adapter bör vi försöka koda på?"
+OpenCL="OpenCL"
+View="Visningsläge"
+View.Description="Vilka egenskaper bör visas?\nOm '\@View.Master\@' används kommer du inte få någon hjälp."
+View.Basic="Grundläggande"
+View.Advanced="Avancerat"
+View.Expert="Expert"
+View.Master="Mästare"
+Debug="Felsök"
+Debug.Description="Aktivera ytterligare felsökningsmeddelanden. Kräver att du kör Open Broadcaster Software Studio med kommandoraden \"--verbose --log_unfiltered\" (utan citationstecken)."
AMF.H264.MaximumLTRFrames="Maximalt antal LTR-bildrutor"
+AMF.H264.MaximumAccessUnitSize="Maximal storlek på åtkomstenhet"
+AMF.H264.MaximumAccessUnitSize.Description="Största storleken på en åtkomstenhet för en NAL."
AMF.H264.HeaderInsertionSpacing.Description="Hur många bildrutor som borde vara mellan NAL-headers."
AMF.H264.WaitForTask="Väntar på arbetsuppgifter"
AMF.H264.WaitForTask.Description="Okänd, experimentell"
-AMF.H264.PreAnalysisPass.Description="Okänd, experimentell"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Okänd, experimentell"
-AMF.H264.GOPSize="GOP-storlek"
-AMF.H264.GOPSize.Description="Okänd, experimentell"
-AMF.H264.GOPAlignment="GOP-justering"
-AMF.H264.GOPAlignment.Description="Okänd, experimentell"
-AMF.H264.MaximumReferenceFrames.Description="Okänd, experimentell"
AMF.H264.SliceMode.Description="Okänd, experimentell"
AMF.H264.MaximumSliceSize.Description="Okänd, experimentell"
AMF.H264.SliceControlMode.Description="Okänd, experimentell"
AMF.H264.SliceControlSize.Description="Okänd, experimentell"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Okänd, experimentell"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Hur många Macroblock bör lagras i varje plats?\nEtt värde på 0 inaktiverar denna funktion.\nIntra-uppdaterad kodning används för snabbare uppspelning och sökning."
-AMF.H264.VideoAPI="Video-API"
-AMF.H264.VideoAPI.Description="Vilken API att använda för kodning."
-AMF.H264.VideoAdapter="Grafikkort"
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Bör kodaren använda OpenCL för att skicka de individuella bildrutorna?"
-AMF.H264.View="Visningsläge"
-AMF.H264.View.Basic="Grundläggande"
-AMF.H264.View.Advanced="Avancerad"
-AMF.H264.View.Expert="Expert"
-AMF.H264.View.Master="Master"
-AMF.H264.Debug="Felsök"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/tr-TR.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/tr-TR.ini
Changed
-AMF.Util.Default="Varsayılan"
-AMF.Util.Automatic="Otomatik"
-AMF.Util.Manual="Elle"
-AMF.Util.Toggle.Disabled="Devre Dışı"
-AMF.Util.Toggle.Enabled="Etkin"
-AMF.H264.Preset="Ön Tanımlı"
-AMF.H264.Preset.ResetToDefaults="Varsayılan Ayarlara Geri Dön"
-AMF.H264.Preset.Recording="Kaydediliyor"
-AMF.H264.Preset.HighQuality="Yüksek Kalite"
-AMF.H264.Preset.Lossless="Kayıpsız"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Kullanım"
-AMF.H264.Usage.UltraLowLatency="Ultra Düşük Gecikme"
-AMF.H264.Usage.LowLatency="Düşük Gecikme"
-AMF.H264.QualityPreset.Speed="Hız"
-AMF.H264.QualityPreset.Balanced="Dengeli"
-AMF.H264.QualityPreset.Quality="Kalite"
-AMF.H264.Profile="Profil"
-AMF.H264.ProfileLevel="Profil Seviyesi"
-AMF.H264.Bitrate.Target="Hedef Bit Hızı"
-AMF.H264.QP.Minimum="Minimum QP"
-AMF.H264.QP.Maximum="Maksimum QP"
-AMF.H264.VBVBuffer.Size="VBV Arabellek Boyutu"
-AMF.H264.FrameSkipping="Kare Atlama"
-AMF.H264.ScanType="Tarama Türü"
-AMF.H264.MotionEstimation.None="Hiçbiri"
-AMF.H264.MotionEstimation.Half="Yarım Piksel"
-AMF.H264.MotionEstimation.Quarter="Çeyrek Piksel"
-AMF.H264.MotionEstimation.Both="Yarım ve Çeyrek Piksel"
AMF.H264.WaitForTask.Description="Bilinmeyen, Deneysel"
-AMF.H264.PreAnalysisPass.Description="Bilinmeyen, Deneysel"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Bilinmeyen, Deneysel"
-AMF.H264.GOPSize.Description="Bilinmeyen, Deneysel"
-AMF.H264.GOPAlignment.Description="Bilinmeyen, Deneysel"
-AMF.H264.MaximumReferenceFrames.Description="Bilinmeyen, Deneysel"
AMF.H264.SliceMode.Description="Bilinmeyen, Deneysel"
AMF.H264.MaximumSliceSize.Description="Bilinmeyen, Deneysel"
AMF.H264.SliceControlMode.Description="Bilinmeyen, Deneysel"
AMF.H264.SliceControlSize.Description="Bilinmeyen, Deneysel"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Bilinmeyen, Deneysel"
-AMF.H264.VideoAPI="Video API"
-AMF.H264.VideoAdapter="Ekran Kartı"
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.View="Görüntüleme Modu"
-AMF.H264.View.Basic="Temel"
-AMF.H264.View.Advanced="Gelişmiş"
-AMF.H264.View.Expert="Uzman"
-AMF.H264.View.Master="Usta"
-AMF.H264.Debug="Hata Ayıklama"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/uk-UA.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/uk-UA.ini
Changed
-AMF.Util.Default="За замовчанням"
-AMF.Util.Automatic="Автоматично"
-AMF.Util.Manual="Вручну"
-AMF.Util.Toggle.Disabled="Вимкнено"
-AMF.Util.Toggle.Enabled="Увімкнено"
-AMF.H264.Preset="Шаблон"
-AMF.H264.Preset.ResetToDefaults="Відновити параметри за замовчанням"
-AMF.H264.Preset.Recording="Звичайний запис"
-AMF.H264.Preset.HighQuality="Висока якість"
-AMF.H264.Preset.Indistinguishable="Якість майже без втрат"
-AMF.H264.Preset.Lossless="Без втрат якості"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Використання"
-AMF.H264.Usage.Description="Тип використання AMF або напрямок його використання, тобто:\n 'Для перекодування' - загальний тип налаштувань (рекомендований).\n 'З ультра-низькою затримкою' - тип лише для дійсно низької затримки під час кодування.\n 'З низькою затримкою' - схожий на зазначений вище, але з трохи більшою затримкою.\n\nТрансляції підтримують лише напрямок використання 'Для перекодування'."
-AMF.H264.Usage.Transcoding="Для перекодування"
-AMF.H264.Usage.UltraLowLatency="З ультра-низькою затримкою"
-AMF.H264.Usage.LowLatency="З низькою затримкою"
-AMF.H264.QualityPreset="Шаблон якості"
-AMF.H264.QualityPreset.Description="Визначає який Шаблон якості AMF намагатиметься встановити:\n 'Швидкість' - є найшвидшим, але має гіршу якість відео.\n 'Збалансований' - це баланс між шаблонами Швидкість та Якість.\n 'Якість' - дає найкращу якість відео для заданого бітрейту."
-AMF.H264.QualityPreset.Speed="Швидкість"
-AMF.H264.QualityPreset.Balanced="Збалансований"
-AMF.H264.QualityPreset.Quality="Якість"
-AMF.H264.Profile="Профіль"
-AMF.H264.Profile.Description="Зазначений профіль H.264 буде використано енкодером:\n 'Baseline' - найпоширеніший серед пристроїв відтворення\n 'Main' - майже всі пристрої відтворення мають підтримку цього профілю\n 'High' - профіль підтримується найсучаснішими пристроями (рекомендується)."
-AMF.H264.ProfileLevel="Рівень профілю"
-AMF.H264.ProfileLevel.Description="Визначає рівень профілю H.264 який буде використано під час кодування:\n 'Автоматично' - програма сама встановить найкращій рівень профілю згідно розміру та частоти кадрів,\n '4.1' - підтримує 1920x1080 30 кадрів/с, 1280x720 60 кадрів/с, 960x540 90 кадрів/с\n '4.2' - підтримує 1920x1080 60 кадрів/с, 1280x720 120 кадрів/с, 960x540 172 кадрів/с\n '5.0' - підтримує 1920x1080 60 кадрів/с, 1280x720 144 кадрів/с, 960x540 172 кадрів/с\n '5.1' - підтримує 3840x2160 30 кадрів/с, 1920x1080 120 кадрів/с, 1280x720 172 кадрів/с, 960x540 172 кадрів/с\n '5.2' - підтримує 3840x2160 60 кадрів/с, 1920x1080 172 кадрів/с, 1280x720 172 кадрів/с, 960x540 172 кадрів/с"
-AMF.H264.RateControlMethod="Метод керування потоком"
-AMF.H264.RateControlMethod.Description="Визначає метод керування потоком:\n 'Фіксований QP (CQP)' - встановлює фіксовані QP для I-/P-/B-кадрів,\n 'Постійний бітрейт (CBR)' - дотримується значення Бажаний бітрейт потоку (використовуючи\nопцію Заповнювати пустоти у бітрейту) (рекомендується для трансляцій),\n 'Змінний бітрейт (максимальний бітрейт обмежено) (VBR)' - потік завжди залишається нижче значення Максимальний бітрейт,\n 'Змінний бітрейт (затримку обмежено) (VBR_LAT)' - потік буде збережено близько до значення Бажаний бітрейт,\nдопоки затримка графічного адаптеру та навантаження на нього дозволятимуть це,\nв іншому випадку бітрейт буде збільшено (рекомендується для запису).\n\nQP - Quantization Parameter (параметр квантування)."
-AMF.H264.RateControlMethod.CQP="Фіксований QP (CQP)"
-AMF.H264.RateControlMethod.CBR="Постійний бітрейт (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Змінний бітрейт (максимальний бітрейт обмежено) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Змінний бітрейт (затримку обмежено) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Бажаний бітрейт"
-AMF.H264.Bitrate.Target.Description="Енкодер буде намагатися дотримуватись цього бітрейту."
-AMF.H264.Bitrate.Peak="Максимальний бітрейт"
-AMF.H264.Bitrate.Peak.Description="Енкодер буде намагатися не перевищувати цього бітрейту."
-AMF.H264.QP.Minimum="Мінімальний QP"
-AMF.H264.QP.Minimum.Description="Найнижчий QP (параметр квантування) в кадрі."
-AMF.H264.QP.Maximum="Максимальний QP"
-AMF.H264.QP.Maximum.Description="Найвищий QP (параметр квантування) в кадрі."
-AMF.H264.QP.IFrame="QP для I-кадрів"
-AMF.H264.QP.IFrame.Description="Фіксоване значення QP (параметр квантування) для I-кадрів."
-AMF.H264.QP.PFrame="QP для P-кадрів"
-AMF.H264.QP.PFrame.Description="Фіксоване значення QP (параметр квантування) для P-кадрів."
-AMF.H264.QP.BFrame="QP для B-кадрів"
-AMF.H264.QP.BFrame.Description="Фіксоване значення QP (параметр квантування) для B-кадрів."
-AMF.H264.VBVBuffer="Буфер VBV"
-AMF.H264.VBVBuffer.Description="Метод, що використовується для визначення параметру Розмір VBV буфера:\n 'Автоматично' - енкодер вираховує розмір враховуючи параметр Буфер VBV, кореляція.\n 'Вручну' - дозволяє користувачеві контролювати цей розмір.\nVBV (Video Buffering Verifier) Буфер використовується різними Методами керування потоком для збереження бітрейту у зазначених межах."
-AMF.H264.VBVBuffer.Strictness="Буфер VBV, кореляція"
-AMF.H264.VBVBuffer.Strictness.Description="Визначає як близько значення Буфер VBV дорівнює заданому, у відсотках."
-AMF.H264.VBVBuffer.Size="Розмір VBV буфера"
-AMF.H264.VBVBuffer.Size.Description="Розмір VBV буфера (Video Buffering Verifier) використовується під час контролю бітрейта відео."
-AMF.H264.VBVBuffer.Fullness="Початкова повнота буферу VBV"
-AMF.H264.VBVBuffer.Fullness.Description="Визначає початкову заповнюваність Буферу VBV, діє лише на початку кодування."
-AMF.H264.FillerData="Заповнювати пустоти у бітрейту"
-AMF.H264.FillerData.Description="Заповнювати пустоти у бітрейту - це наповнювання інформацією, яку декодер просто ігнорує. Якщо увімкнено то зайве місце у бітрейті буде заповнено нічого не значущою інформацією за для досягнення точного значення Бажаний бітрейт. Аналог Filler Data."
-AMF.H264.FrameSkipping="Пропускати кадри"
-AMF.H264.FrameSkipping.Description="Пропускати кадри - дозволяє енкодеру відкидати кадри, якщо керування потоком не може досягти встановленого значення Бажаний бітрейт.\nКоли енкодер пропускає кадр, він додає дублювати-останній-кадр NAL до потоку.\nМоже допомогти при встановленні дуже низьких значень Бажаний бітрейт."
-AMF.H264.EnforceHRDCompatibility="Застосувати примусову сумісність з HRD"
-AMF.H264.EnforceHRDCompatibility.Description="Застосувати примусову сумісність з HRD (Hypothetical Reference Decoder) - накладає обмеження на максимальне значення QP (параметра квантування) у кадрі задля досягнення сумісності з HRD.\nНе рекомендується для записів або трансляцій. Використовувати лише для сумісності з найстарішими пристроями які мають лише базовий декодер."
-AMF.H264.KeyframeInterval="Інтервал ключових кадрів"
-AMF.H264.KeyframeInterval.Description="Визначає кількість секунд між двома повними кадрами (ключовими кадрами).\nТакож контролює довжину послідовності кадрів у групі зображень (GOP)."
-AMF.H264.IDRPeriod="IDR, період (кадрів)"
-AMF.H264.IDRPeriod.Description="Визначає відстань між Instantaneous Decoding Refreshes (IDR) (ключовими кадрами), в кадрах.\nТакож контролює довжину послідовності кадрів у групі зображень (GOP)."
-AMF.H264.BFrame.Pattern="B-кадри"
-AMF.H264.BFrame.Pattern.Description="Визначає кількість послідовних B-кадрів у кодуванні.\nПідтримується принаймні 2-м та 3-м поколінням VCE карт. Маєнегативний вплив на продуктивність."
-AMF.H264.BFrame.DeltaQP="B-кадри, відхил QP"
-AMF.H264.BFrame.DeltaQP.Description="Відхил QP (параметра квантування) для не опорних B-кадрів, по відношенню до I-кадрів та P-кадрів."
-AMF.H264.BFrame.Reference="B-кадри як опорні"
-AMF.H264.BFrame.Reference.Description="Дозволяє робити B-кадри опорними для інших B-кадрів в додаток до вже існуючих P- та I-кадрів."
-AMF.H264.BFrame.ReferenceDeltaQP="Відхил QP (параметра квантування) для опорних B-кадрів"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="Відхил QP (параметра квантування) для опорних B-кадрів, по відношенню до I-кадрів та P-кадрів."
-AMF.H264.DeblockingFilter="Деблокінг-фільтр"
-AMF.H264.DeblockingFilter.Description="Встановлює позначку, що дозволяє декодеру використовувати Деблокінг-фільтр для цього відео."
-AMF.H264.ScanType="Вид розгортки"
-AMF.H264.ScanType.Description="Визначає вид розгортки який треба використовувати у кодуванні,\nзавжди користуйтеся значенням 'Прогресивна'."
-AMF.H264.ScanType.Progressive="Прогресивна"
-AMF.H264.ScanType.Interlaced="Черезрядкова"
-AMF.H264.MotionEstimation="Оцінка руху"
-AMF.H264.MotionEstimation.Description="Оцінка руху дозволяє енкодеру зменшити вимоги до бітрейту завдяки розрахункам з переміщення пікселів."
-AMF.H264.MotionEstimation.None="Немає"
-AMF.H264.MotionEstimation.Half="Пів-пікселя"
-AMF.H264.MotionEstimation.Quarter="Чверть-пікселя"
-AMF.H264.MotionEstimation.Both="Пів-пікселя та Чверть-пікселя"
-AMF.H264.CodingType="Схема кодування"
-AMF.H264.CodingType.Description="Визначає яку схему кодування використовувати:\n 'За замовчанням' - програма вирішує самостійно (рекомендується).\n 'CABAC' - контекстно-залежне адаптивне бінарне арифметичне кодування, схема має кращу компресію даних, але здійснює більше навантаження.\n 'CAVLC' - контекстно-залежне адаптивне кодування із змінною довжиною кодового слова, схема має дещо меншу компресію даних, але й здійснює менше навантаження."
+Utility.Default="За замовчанням"
+Utility.Automatic="Автоматично"
+Utility.Manual="Вручну"
+Utility.Switch.Disabled="Вимкнено"
+Utility.Switch.Enabled="Увімкнено"
+Preset="Шаблон"
+Preset.ResetToDefaults="Відновити параметри за замовчанням"
+Preset.Recording="Звичайний запис"
+Preset.HighQuality="Висока якість"
+Preset.Indistinguishable="Якість майже без втрат"
+Preset.Lossless="Без втрат якості"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Використання"
+Usage.Description="Тип використання AMF або напрямок його використання, тобто:\n 'Для перекодування' - загальний тип налаштувань (рекомендований).\n 'З ультра-низькою затримкою' - тип лише для дійсно низької затримки під час кодування.\n 'З низькою затримкою' - схожий на зазначений вище, але з трохи більшою затримкою.\n\nТрансляції підтримують лише напрямок використання 'Для перекодування'."
+Usage.Transcoding="Для перекодування"
+Usage.UltraLowLatency="З ультра-низькою затримкою"
+Usage.LowLatency="З низькою затримкою"
+Usage.Webcam="Для вебкамери"
+QualityPreset="Шаблон якості"
+QualityPreset.Description="Визначає який Шаблон якості AMF намагатиметься встановити:\n 'Швидкість' - є найшвидшим, але має гіршу якість відео.\n 'Збалансований' - це баланс між шаблонами Швидкість та Якість.\n 'Якість' - дає найкращу якість відео для заданого бітрейту."
+QualityPreset.Speed="Швидкість"
+QualityPreset.Balanced="Збалансований"
+QualityPreset.Quality="Якість"
+Profile="Профіль"
+Profile.Description="Зазначений профіль буде використано енкодером. Від того що має найвищу сумісність (вище) до найвищої якості (нижче)."
+ProfileLevel="Рівень профілю"
+ProfileLevel.Description="Визначає рівень профілю який буде використано під час кодування. 'Автоматично' - програма сама встановить найкращий рівень."
+Tier="Рівень додатку"
+Tier.Description="Рівень додатку (або Tier). 'High' використовується для змісту з високим бітрейтом або професійне використання, тоді як 'Main' застосовується в усіх інших випадках."
+AspectRatio="Пропорції"
+AspectRatio.Description="Співвідношення сторін яке буде записано до файлу Виводу."
+CodingType="Схема кодування"
+CodingType.Description="Визначає яку схему кодування використовувати:\n 'За замовчанням' - програма вирішує самостійно (рекомендується).\n 'CABAC' - контекстно-залежне адаптивне бінарне арифметичне кодування, схема має кращу компресію даних, але здійснює більше навантаження.\n 'CAVLC' - контекстно-залежне адаптивне кодування із змінною довжиною кодового слова, схема має дещо меншу компресію даних, але й здійснює менше навантаження."
+MaximumReferenceFrames="Максимальна кількість опорних кадрів"
+MaximumReferenceFrames.Description="Визначає яку кількість кадрів дозволено енкодеру використовувати у якості опорних. Має безпосередній вплив на якість кодування."
+RateControlMethod="Метод керування потоком"
+RateControlMethod.Description="Визначає метод керування потоком:\n 'Фіксований QP (CQP)' - встановлює фіксовані QP для I-/P-/B-кадрів,\n 'Постійний бітрейт (CBR)' - дотримується значення Бажаний бітрейт потоку (використовуючи\nопцію Заповнювати пустоти у бітрейту) (рекомендується для трансляцій),\n 'Змінний бітрейт (максимальний бітрейт обмежено) (VBR)' - потік завжди залишається нижче значення Максимальний бітрейт,\n 'Змінний бітрейт (затримку обмежено) (VBR_LAT)' - потік буде збережено близько до значення Бажаний бітрейт,\nдопоки затримка графічного адаптеру та навантаження на нього дозволятимуть це,\nв іншому випадку бітрейт буде збільшено (рекомендується для запису).\n\nQP - Quantization Parameter (параметр квантування)."
+RateControlMethod.CQP="Фіксований QP (CQP)"
+RateControlMethod.CBR="Постійний бітрейт (CBR)"
+RateControlMethod.VBR="Змінний бітрейт (максимальний бітрейт обмежено) (VBR)"
+RateControlMethod.VBRLAT="Змінний бітрейт (затримку обмежено) (VBR_LAT)"
+PrePassMode="Попередній прохід"
+PrePassMode.Description="Це ще один прохід енкодера, що дозволяє ще компактніше розташувати блоки даних у наданому бітрейту. Ефективність методу залежить від апаратного забезпечення."
+PrePassMode.Quarter="Увімкнено (на чверть від повного розміру)"
+PrePassMode.Half="Увімкнено (на половину від повного розміру)"
+PrePassMode.Full="Увімкнено (на повний розмір)"
+Bitrate.Target="Бажаний бітрейт"
+Bitrate.Target.Description="Енкодер буде намагатися дотримуватись цього бітрейту."
+Bitrate.Peak="Максимальний бітрейт"
+Bitrate.Peak.Description="Енкодер буде намагатися не перевищувати цього бітрейту."
+QP.IFrame="QP для I-кадрів"
+QP.IFrame.Description="Фіксоване значення QP (параметр квантування) для I-кадрів."
+QP.PFrame="QP для P-кадрів"
+QP.PFrame.Description="Фіксоване значення QP (параметр квантування) для P-кадрів."
+QP.BFrame="QP для B-кадрів"
+QP.BFrame.Description="Фіксоване значення QP (параметр квантування) для B-кадрів."
+QP.Minimum="Мінімальний QP"
+QP.Minimum.Description="Найнижчий QP (параметр квантування) в кадрі."
+QP.IFrame.Minimum="Мінімальний QP для I-кадрів"
+QP.IFrame.Minimum.Description="Найнижчий QP (параметр квантування) для I-кадрів."
+QP.PFrame.Minimum="Мінімальний QP для P-кадрів"
+QP.PFrame.Minimum.Description="Найнижчий QP (параметр квантування) для P-кадрів."
+QP.Maximum="Максимальний QP"
+QP.Maximum.Description="Найвищий QP (параметр квантування) в кадрі."
+QP.IFrame.Maximum="Максимальний QP для I-кадрів"
+QP.IFrame.Maximum.Description="Найвищий QP (параметр квантування) для I-кадрів."
+QP.PFrame.Maximum="Максимальний QP для P-кадрів"
+QP.PFrame.Maximum.Description="Найвищий QP (параметр квантування) для P-кадрів."
+FillerData="Заповнювати пустоти у бітрейту"
+FillerData.Description="Заповнювати пустоти у бітрейту - це наповнювання інформацією, яку декодер просто ігнорує. Якщо увімкнено, то зайве місце у бітрейті буде заповнено нічого не значущою інформацією задля дотримання точного значення Бажаний бітрейт. Аналог Filler Data."
+FrameSkipping="Пропускати кадри"
+FrameSkipping.Description="Пропускати кадри - дозволяє енкодеру відкидати кадри, якщо керування потоком не може досягти встановленого значення Бажаний бітрейт.\nКоли енкодер пропускає кадр, він додає дублювати-останній-кадр NAL до потоку.\nМоже допомогти при встановленні дуже низьких значень Бажаний бітрейт."
+VBAQ="VBAQ"
+VBAQ.Description="Увімкнути використання функції Variance Based Adaptive Quantization (VBAQ), яка базується на дисперсії пікселів задля кращого використання наявного бітрейту.\nЗаощаджує на комплексних текстурах, до яких око людини менш чутливе, на користь плавних переходів в інших місцях.\nЗа рахунок цього, може дещо поліпшити якість зображення."
+EnforceHRD="Застосувати примусову сумісність з HRD"
+EnforceHRD.Description="Застосувати примусову сумісність з HRD (Hypothetical Reference Decoder) - накладає обмеження на енкодер задля забезпечення норм стандартного потоку."
+VBVBuffer="Буфер VBV"
+VBVBuffer.Description="Метод, що використовується для визначення параметру Розмір VBV буфера:\n 'Автоматично' - енкодер вираховує розмір враховуючи параметр Буфер VBV, кореляція.\n 'Вручну' - дозволяє користувачеві контролювати цей розмір.\n\nVBV (Video Buffering Verifier) Буфер використовується різними Методами керування потоком для збереження бітрейту у зазначених межах."
+VBVBuffer.Strictness="Буфер VBV, кореляція"
+VBVBuffer.Strictness.Description="Визначає як близько значення Буфер VBV дорівнює заданому, у відсотках."
+VBVBuffer.Size="Розмір VBV буфера"
+VBVBuffer.Size.Description="Розмір VBV буфера (Video Buffering Verifier) використовується під час контролю бітрейта відео."
+VBVBuffer.InitialFullness="Початкова повнота буферу VBV"
+VBVBuffer.InitialFullness.Description="Визначає початкову заповнюваність Буферу VBV (у відсотках), діє лише на початку кодування."
+KeyframeInterval="Інтервал ключових кадрів"
+KeyframeInterval.Description="Визначає кількість секунд між двома повними кадрами (ключовими кадрами)."
+H264.IDRPeriod="IDR, період (кадрів)"
+H264.IDRPeriod.Description="Визначає відстань між Instantaneous Decoding Refreshes (IDR) (ключовими кадрами), в кадрах.\nТакож контролює довжину послідовності кадрів у групі зображень (GOP)."
+H265.IDRPeriod="IDR, період (групи зображень)"
+H265.IDRPeriod.Description="Визначає відстань між Instantaneous Decoding Refreshes (IDR) (ключовими кадрами), тут вимірюється у кількості груп зображень (GOP) між двома IDR."
+GOP.Type="Тип розміру GOP"
+GOP.Type.Description="Визначає який тип розміру групи зображень (GOP) буде використано у кодуванні:\n 'Фіксований' - постійна відстань між двома групами зображень.\n 'Змінний' - розмір кожної групи зображень може змінюватись у часі.\nФіксований розмір групи зображень є часткою стандарту H.264 та найбільш відповідає трансляціям, у той час як Змінний розмір групи зображень краще застосовувати у високоякісних записах малого розміру."
+GOP.Type.Fixed="Фіксований"
+GOP.Type.Variable="Змінний"
+GOP.Size="Розмір GOP"
+GOP.Size.Description="Розмір групи зображень (GOP), у кадрах."
+GOP.Size.Minimum="Мінімальний розмір GOP"
+GOP.Size.Minimum.Description="Мінімальний розмір групи зображень (GOP), у кадрах."
+GOP.Size.Maximum="Максимальний розмір GOP"
+GOP.Size.Maximum.Description="Максимальний розмір групи зображень (GOP), у кадрах."
+GOP.Alignment="Вирівнювання GOP"
+GOP.Alignment.Description="Експериментально, ефект невідомий. Використовувати лише на свій страх і ризик."
+BFrame.Pattern="Послідовні B-кадри"
+BFrame.Pattern.Description="Визначає кількість послідовних B-кадрів у кодуванні.\nПідтримується принаймні 2-м та 3-м поколінням VCE карт. Має негативний вплив на продуктивність."
+BFrame.DeltaQP="B-кадри, відхил QP"
+BFrame.DeltaQP.Description="Відхил QP (параметра квантування) для не опорних B-кадрів, по відношенню до I-кадрів та P-кадрів."
+BFrame.Reference="B-кадри як опорні"
+BFrame.Reference.Description="Дозволяє робити B-кадри опорними для інших B-кадрів в додаток до вже існуючих P- та I-кадрів."
+BFrame.ReferenceDeltaQP="Відхил QP для опорних B-кадрів"
+BFrame.ReferenceDeltaQP.Description="Відхил QP (параметра квантування) для опорних B-кадрів, по відношенню до I-кадрів та P-кадрів."
+DeblockingFilter="Деблокінг-фільтр"
+DeblockingFilter.Description="Встановлює позначку, що дозволяє декодеру використовувати Деблокінг-фільтр для цього відео."
+MotionEstimation="Оцінка руху"
+MotionEstimation.Description="Оцінка руху дозволяє енкодеру зменшити вимоги до бітрейту завдяки розрахункам з переміщення пікселів."
+MotionEstimation.Quarter="Чверть-пікселя"
+MotionEstimation.Half="Пів-пікселя"
+MotionEstimation.Full="Пів-пікселя та Чверть-пікселя"
+Video.API="Відео API"
+Video.API.Description="Визначає який API використовувати для кодування."
+Video.Adapter="Відеоадаптер"
+Video.Adapter.Description="Визначає який відеоадаптер використовувати для кодування."
+OpenCL="OpenCL"
+OpenCL.Description="Чи буде енкодер використовувати OpenCL щоб надсилати кожен окремий кадр? Технічно швидкий, але може конфліктувати з драйвером Intel (не сумісні бібліотеки OpenCL)."
+View="Вид для налаштувань"
+View.Description="Визначає кількість наявних опцій для налаштування.\nНавряд чи розробник вам допомагатиме, якщо ви оберете рівень 'Бог енкодерів'."
+View.Basic="Базовий"
+View.Advanced="Розширений"
+View.Expert="Експерт"
+View.Master="Бог енкодерів"
+Debug="Відладка"
+Debug.Description="Вмикає занесення до лог-файлу додаткової інформації про процес кодування. Потрібен перезапуск Open Broadcaster Software Studio з ключами '--verbose --log_unfiltered' (вводити без '). Корисно у випадку коли ви бажаєте звернутися за допомогою до розробника енкодера."
AMF.H264.MaximumLTRFrames="Кількість LTR-кадрів контрольованих користувачем"
AMF.H264.MaximumLTRFrames.Description="Long-Term Reference (LTR) кадри дозволяють при кодуванні маркувати декілька кадрів як опорні для інших.\nЯкщо 'Кількість LTR-кадрів контрольованих користувачем' вказано, то енкодер не підтримуватиме B-кадри у відео."
AMF.H264.MaximumAccessUnitSize="Максимальний розмір Access Unit"
AMF.H264.MaximumAccessUnitSize.Description="Визначає максимальний розмір Access Unit для NAL. Значення 0 дозволяє енкодеру обирати найкращий розмір самостійно."
AMF.H264.HeaderInsertionSpacing="Заголовки потоку, період (кадрів)"
AMF.H264.HeaderInsertionSpacing.Description="Визначає скільки кадрів повинно бути між NAL заголовками. Не рекомендується змінювати цей параметр від значення 0 (автоматично)."
-AMF.H264.GOPSize="Розмір GOP"
-AMF.H264.MaximumReferenceFrames="Максимальна кількість опорних кадрів"
AMF.H264.SlicesPerFrame="Фрагментів на кадр"
AMF.H264.SlicesPerFrame.Description="Скільки фрагментів I-кадрів буде у кожному кадру?\nНульове значення дозволяє енкодеру визначати цю кількість під час кодування самостійно.\nIntra-Refresh кодування використовується для більш швидкого відтворення та навігації."
AMF.H264.IntraRefresh.NumberOfStripes="Кількість рядків Intra-Refresh"
AMF.H264.IntraRefresh.MacroblocksPerSlot="Кількість Intra-Refresh макроблоків на слот"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="Визначає скільки Intra-Refresh макроблоків на слот можна використовувати при кодуванні.\nНульове значення вимикає цю опцію.\nIntra-Refresh кодування використовується для більш швидкого відтворення та навігації."
-AMF.H264.VideoAPI="Відео API"
-AMF.H264.VideoAPI.Description="Визначає який API використовувати для кодування."
-AMF.H264.VideoAdapter="Відеоадаптер"
-AMF.H264.VideoAdapter.Description="Визначає який відеоадаптер використовувати для кодування."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Чи буде енкодер використовувати OpenCL щоб надсилати кожен окремий кадр?"
-AMF.H264.View="Вид для налаштувань"
-AMF.H264.View.Description="Визначає кількість наявних опцій для налаштування. Навряд чи розробник вам допомагатиме, якщо ви оберете рівень 'Експерт' або 'Бог енкодерів'."
-AMF.H264.View.Basic="Базовий"
-AMF.H264.View.Advanced="Розширений"
-AMF.H264.View.Expert="Експерт"
-AMF.H264.View.Master="Бог енкодерів"
-AMF.H264.Debug="Відладка"
-AMF.H264.Debug.Description="Вмикає занесення до лог-файлу додаткової інформації про процес кодування. Корисно у випадку коли ви бажаєте звернутися за допомогою до розробника енкодера."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/vi-VN.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/vi-VN.ini
Changed
-AMF.Util.Default="Mặc định"
-AMF.Util.Automatic="Tự động"
-AMF.Util.Manual="Thủ công"
-AMF.Util.Toggle.Disabled="Tắt"
-AMF.Util.Toggle.Enabled="Bật"
-AMF.H264.Preset="Mẫu thiết lập"
-AMF.H264.Preset.ResetToDefaults="Thiết lập về mặc định"
-AMF.H264.Preset.Recording="Quay video"
-AMF.H264.Preset.HighQuality="Chất lượng cao"
-AMF.H264.Preset.Indistinguishable="Không thể phân biệt"
-AMF.H264.Preset.Lossless="Lossless"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="Sử dụng"
-AMF.H264.Usage.Transcoding="Chuyển mã"
-AMF.H264.Usage.UltraLowLatency="Độ trễ cực thấp"
-AMF.H264.Usage.LowLatency="Độ trễ thấp"
-AMF.H264.QualityPreset="Quality Preset"
-AMF.H264.QualityPreset.Speed="Tốc độ"
-AMF.H264.QualityPreset.Balanced="Cân bằng"
-AMF.H264.QualityPreset.Quality="Chất lượng"
-AMF.H264.Profile="Profile"
-AMF.H264.ProfileLevel="Profile Level"
-AMF.H264.RateControlMethod="Cách kiểm soát Bitrate"
-AMF.H264.RateControlMethod.CQP="QP Không thay đổi (CQP)"
-AMF.H264.RateControlMethod.CBR="Bitrate Không thay đổi (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="Bitrate Thay đổi được (Phụ thuộc vào Bitrate Cao nhất) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="Bitrate Thay đổi được (Phụ thuộc vào độ trễ) (VBR_LAT)"
-AMF.H264.Bitrate.Target="Bitrate Mục tiêu"
-AMF.H264.Bitrate.Peak="Bitrate Cao nhất"
-AMF.H264.QP.Minimum="QP Thấp nhất"
-AMF.H264.QP.Maximum="QP Cao nhất"
-AMF.H264.QP.IFrame="I-Frame QP"
-AMF.H264.QP.PFrame="P-Frame QP"
-AMF.H264.QP.BFrame="B-Frame QP"
-AMF.H264.FillerData="Filler Data (Thêm data cho đủ bitrate)"
-AMF.H264.FrameSkipping="Frame Skipping"
-AMF.H264.EnforceHRDCompatibility="Enforce HRD Compatibility"
-AMF.H264.KeyframeInterval="Thời gian đặt keyframe"
-AMF.H264.DeblockingFilter="De-blocking Filter (Lọc chống nhiễu ảnh)"
-AMF.H264.ScanType="Quét hình loại"
-AMF.H264.ScanType.Progressive="Quét nguyên ảnh (Progressive)"
-AMF.H264.ScanType.Interlaced="Quét 1/2 ảnh (Interlaced)"
-AMF.H264.MotionEstimation="Dự đoán bù trừ chuyển động"
-AMF.H264.MotionEstimation.Description="Dự toán chuyển động cho phép bộ mã hóa giảm bitrate cần thiết bằng cách ước tính một pixel sẽ đi đâu."
-AMF.H264.MotionEstimation.None="Không"
-AMF.H264.MotionEstimation.Half="1/2 Pixel"
-AMF.H264.MotionEstimation.Quarter="1/4 Pixel"
-AMF.H264.MotionEstimation.Both="1/2 và 1/4 Pixel"
-AMF.H264.CodingType="Coding Type"
+Utility.Default="Mặc định"
+Utility.Automatic="Tự động"
+Utility.Manual="Thủ công"
+Utility.Switch.Disabled="Tắt"
+Utility.Switch.Enabled="Bật"
+Preset="Mẫu thiết lập"
+Preset.ResetToDefaults="Thiết lập về mặc định"
+Preset.Recording="Quay video"
+Preset.HighQuality="Chất lượng cao"
+Preset.Indistinguishable="Không thể phân biệt"
+Preset.Lossless="Lossless"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="Sử dụng"
+Usage.Transcoding="Chuyển mã"
+Usage.UltraLowLatency="Độ trễ cực thấp"
+Usage.LowLatency="Độ trễ thấp"
+Usage.Webcam="Webcam"
+QualityPreset="Quality Preset"
+QualityPreset.Speed="Tốc độ"
+QualityPreset.Balanced="Cân bằng"
+QualityPreset.Quality="Chất lượng"
+Profile="Hồ sơ"
+ProfileLevel="Profile Level"
+AspectRatio="Tỉ lệ"
+CodingType="Coding Type"
+RateControlMethod="Cách kiểm soát Bitrate"
+RateControlMethod.CQP="QP Không thay đổi (CQP)"
+RateControlMethod.CBR="Bitrate Không thay đổi (CBR)"
+RateControlMethod.VBR="Bitrate Thay đổi được (Phụ thuộc vào Bitrate Cao nhất) (VBR)"
+RateControlMethod.VBRLAT="Bitrate Thay đổi được (Phụ thuộc vào độ trễ) (VBR_LAT)"
+PrePassMode="Pre-Pass Mode"
+Bitrate.Target="Bitrate Mục tiêu"
+Bitrate.Peak="Bitrate Cao nhất"
+QP.IFrame="I-Frame QP"
+QP.PFrame="P-Frame QP"
+QP.BFrame="B-Frame QP"
+QP.Minimum="QP Thấp nhất"
+QP.Minimum.Description="Giá trị QP nhỏ nhất sử dụng trong 1 khung ảnh."
+QP.IFrame.Minimum="I-Frame QP Thấp nhất"
+QP.IFrame.Minimum.Description="Giá trị QP nhỏ nhất sử dụng trong 1 I-Frame."
+QP.PFrame.Minimum="P-Frame QP Thấp nhất"
+QP.PFrame.Minimum.Description="Giá trị QP nhỏ nhất sử dụng trong 1 P-Frame."
+QP.Maximum="QP Cao nhất"
+QP.Maximum.Description="Giá trị QP lớn nhất sử dụng trong 1 khung ảnh."
+QP.IFrame.Maximum="I-Frame QP Cao nhất"
+QP.IFrame.Maximum.Description="Giá trị QP lớn nhất sử dụng trong 1 I-Frame."
+QP.PFrame.Maximum="P-Frame QP Cao nhất"
+QP.PFrame.Maximum.Description="Giá trị QP lớn nhất sử dụng trong P-Frame."
+FillerData="Filler Data (Thêm data cho đủ bitrate)"
+FillerData.Description="Bật Filler Data cho phép bộ mã hóa giữ ít nhất \@Bitrate.Target\@ bằng cách điền vào không gian còn lại trong một chuỗi thông tin trống."
+VBAQ="VBAQ"
+EnforceHRD="Ép sử dụng HRD"
+KeyframeInterval="Thời gian đặt keyframe"
+KeyframeInterval.Description="Khoảng thời gian (bằng giây) giữa các keyframe."
+GOP.Type="Loại GOP"
+GOP.Type.Fixed="Cố định"
+GOP.Type.Variable="Biến"
+GOP.Size="Kích cỡ GOP"
+GOP.Size.Description="Kích thước của một GOP (Nhóm ảnh) trong khung hình."
+GOP.Size.Minimum="Kích thước GOP tối thiểu"
+GOP.Size.Minimum.Description="Kích thước tối thiểu của một GOP (Nhóm ảnh) trong khung hình."
+GOP.Size.Maximum="Kích thước GOP tối đa"
+GOP.Size.Maximum.Description="Kích thước tối đa của một GOP (Nhóm ảnh) trong khung hình."
+GOP.Alignment.Description="Thử nghiệm, kết quả chưa biết. Dùng nó có thể tạo một số rủi ro."
+DeblockingFilter="De-blocking Filter (Lọc chống nhiễu ảnh)"
+DeblockingFilter.Description="Cho phép bộ giải mã áp dụng lọc chống nhiễu ảnh."
+MotionEstimation="Dự đoán bù trừ chuyển động"
+MotionEstimation.Description="Dự đoán bù trừ chuyển động cho phép bộ mã hóa giảm bitrate cần thiết bằng cách ước tính một pixel sẽ đi đâu."
+MotionEstimation.Quarter="1/4 Pixel"
+MotionEstimation.Half="1/2 Pixel"
+MotionEstimation.Full="1/4 & 1/2 Pixel"
+Video.API="Video API"
+Video.Adapter="Card đồ họa"
+OpenCL="OpenCL"
+OpenCL.Description="Nên sử dụng OpenCL để gửi các khung? Về mặt kỹ thuật thì nhanh hơn nhưng có thể gây ra vấn đề với driver của Intel (do thư viện OpenCL không tương thích)."
+View="Chế độ xem"
+View.Description="Những thuộc tính gì sẽ được hiển thị?\nNếu sử dụng '\@View.Master\@' thì bạn sẽ không nhận được hỗ trợ."
+View.Basic="Cơ bản"
+View.Advanced="Nâng cao"
+View.Expert="Chuyên gia"
+View.Master="Bật cả tính năng ẩn"
+Debug="Gỡ lỗi"
+Debug.Description="Cho phép hiện thông báo gỡ lỗi bổ sung. Đòi hỏi bạn phải chạy Open Broadcaster Software Studio với dòng lệnh --verbose --log_unfiltered."
AMF.H264.MaximumLTRFrames="LTR Frames tối đa"
-AMF.H264.GOPSize="Kích cỡ GOP"
-AMF.H264.GOPSize.Description="Không rõ, thử nghiệm"
+AMF.H264.WaitForTask.Description="Không rõ, thử nghiệm"
AMF.H264.SlicesPerFrame="Lát cho mỗi khung hình"
-AMF.H264.VideoAPI="Video API"
-AMF.H264.VideoAdapter="Card đồ họa"
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.View="Chế độ xem"
-AMF.H264.View.Basic="Cơ bản"
-AMF.H264.View.Advanced="Nâng cao"
-AMF.H264.View.Expert="Chuyên gia"
-AMF.H264.View.Master="Bật cả tính năng ẩn"
-AMF.H264.Debug="Gỡ lỗi"
+AMF.H264.SliceMode.Description="Không rõ, thử nghiệm"
+AMF.H264.MaximumSliceSize.Description="Không rõ, thử nghiệm"
+AMF.H264.SliceControlMode.Description="Không rõ, thử nghiệm"
+AMF.H264.SliceControlSize.Description="Không rõ, thử nghiệm"
+AMF.H264.IntraRefresh.NumberOfStripes.Description="Không rõ, thử nghiệm"
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/zh-CN.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/zh-CN.ini
Changed
-AMF.Util.Default="默认"
-AMF.Util.Automatic="自动"
-AMF.Util.Manual="手动"
-AMF.Util.Toggle.Disabled="禁用"
-AMF.Util.Toggle.Enabled="启用"
-AMF.H264.Preset="预设"
-AMF.H264.Preset.ResetToDefaults="重置为默认值"
-AMF.H264.Preset.Recording="录像"
-AMF.H264.Preset.HighQuality="高质量"
-AMF.H264.Preset.Indistinguishable="无法区分"
-AMF.H264.Preset.Lossless="无损"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="使用"
-AMF.H264.Usage.Description="AMF 应调整为什么用法:\n-'转码' 是通用转码(推荐), \n '超低延迟' 对于真的低延迟编码,\n-'低延迟' 类似于上面稍高的延迟.\n 推流仅支持 '转码', 所有其他值可以用于录制."
-AMF.H264.Usage.Transcoding="转码"
-AMF.H264.Usage.UltraLowLatency="超低延迟"
-AMF.H264.Usage.LowLatency="低延迟"
-AMF.H264.QualityPreset="质量预设"
-AMF.H264.QualityPreset.Description="哪种质量预设 AMF 应该尝试对于目标:\n-'速度' 是最快但有最差的质量,\n-'平衡' 是 '速度' 和 '质量' 之间, 提供一个两者之间的较好的平衡,\n-'质量' 给出了最好的质量对于给定的比特率."
-AMF.H264.QualityPreset.Speed="速度"
-AMF.H264.QualityPreset.Balanced="平衡"
-AMF.H264.QualityPreset.Quality="质量"
-AMF.H264.Profile="配置文件"
-AMF.H264.Profile.Description="H.264的编码选项有:\n-'Baseline' 支持的平台最广,\n-'Main' 支持较老的设备(如果使用的平台是手机,推荐使用),\n-'High' 被当下的设备支持(推荐使用)。"
-AMF.H264.ProfileLevel="配置等级"
-AMF.H264.ProfileLevel.Description="对于编码使用哪种 H.264 Profile Level:\n-'Automatic' 根据给定的帧率和帧大小计算最优的 profile level,\n-'4.1' 支持 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n-'4.2' 支持 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n-'5.0' 支持 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n-'5.1' 支持 3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n-'5.2' 支持 3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="速率控制方法"
-AMF.H264.RateControlMethod.Description="该使用什么调节码率的方法:\n- '\@AMF.H264.RateControlMethod.CQP\@' assigns fixed I-/P-/B-Frame QP (Quantization Parameter) values,\n- '\@AMF.H264.RateControlMethod.CBR\@' stays at the given Target Bitrate (using Filler Data) (推荐推流使用),\n- '\@AMF.H264.RateControlMethod.VBR\@' stays below the given Peak Bitrate,\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@' stays close to the Target Bitrate if GPU latency and load allow for it, otherwise will use higher bitrate (推荐录像使用)."
-AMF.H264.RateControlMethod.CQP="恒定 QP (CQP)"
-AMF.H264.RateControlMethod.CBR="固定比特率 (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="变比特率 (峰值约束) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="变比特率 (延迟约束) (VBR_LAT)"
-AMF.H264.Bitrate.Target="目标比特率"
-AMF.H264.Bitrate.Target.Description="尝试在这个序列中获取的比特率."
-AMF.H264.Bitrate.Peak="峰值比特率"
-AMF.H264.Bitrate.Peak.Description="尝试在这个序列中获取的峰值比特率."
-AMF.H264.QP.Minimum="最低 QP"
-AMF.H264.QP.Minimum.Description="一帧中最低 QP (量化参数) 值"
-AMF.H264.QP.Maximum="最高 QP"
-AMF.H264.QP.Maximum.Description="一帧中最高 QP (量化参数) 值"
-AMF.H264.QP.IFrame="I 帧 QP"
-AMF.H264.QP.IFrame.Description="用于 I 帧的固定的 QP 值."
-AMF.H264.QP.PFrame="P 帧 QP"
-AMF.H264.QP.PFrame.Description="用于 P 帧的固定 QP 值."
-AMF.H264.QP.BFrame="B 帧 QP"
-AMF.H264.QP.BFrame.Description="用于 B 帧的固定 QP (量化参数) 值."
-AMF.H264.VBVBuffer="VBV 缓存"
-AMF.H264.VBVBuffer.Description="应使用什么方法来确定 VBV 缓冲区大小:\n-'自动' 计算使用严格约束的大小,\n- '手册' 允许用户控制大小.\nVBV缓存 (视频缓冲程序验证程序) 用于某些率控制方法来保持整体的比特率在给定的约束内."
-AMF.H264.VBVBuffer.Strictness="VBV 缓存规范性"
-AMF.H264.VBVBuffer.Strictness.Description="决定 VBV 缓存的严格性, 100% 表示尽可能严格, 0% 表示无限制."
-AMF.H264.VBVBuffer.Size="VBV 缓存大小"
-AMF.H264.VBVBuffer.Size.Description="用于在一个序列中的控制比特率的 VBV 缓存的大小."
-AMF.H264.VBVBuffer.Fullness="VBV 缓冲满"
-AMF.H264.VBVBuffer.Fullness.Description="VBV 缓存最初应该多满, 将只会影响最初的序列的编码."
-AMF.H264.FillerData="填充数据"
-AMF.H264.FillerData.Description="启用填充数据允许编码器至少保持目标比特率, 通过填满空信息序列中的剩余空间."
-AMF.H264.FrameSkipping="跳过的帧"
-AMF.H264.FrameSkipping.Description="跳过的帧允许编码器下丢帧, 以满足目标比特率要求.\n当编码器丢帧时, 它反而插入一个重复最后一帧的 NAL 到流中.\n可以帮助非常低的目标比特率."
-AMF.H264.EnforceHRDCompatibility="强制 HRD 兼容"
-AMF.H264.EnforceHRDCompatibility.Description="强制假设的参考解码器限制在一个帧内的最大的 QP 值更改.\n不推荐用于录制或推流, 并且应仅用于面向只有参考软件解码器的很老的设备."
-AMF.H264.KeyframeInterval="关键帧间隔"
-AMF.H264.KeyframeInterval.Description="非丢弃帧之间应该多少秒.\n也控制序列(GOP) 的大小."
-AMF.H264.IDRPeriod="IDR 周期"
-AMF.H264.IDRPeriod.Description="定义的帧内瞬时解码刷新 (IDR) 的距离. 也控制 GOP 序列长度."
-AMF.H264.BFrame.Pattern="B 帧"
-AMF.H264.BFrame.Pattern.Description="多少 B 帧应该用于编码.\n2 代和 3 代 VCE 卡支持. 对编码性能有负面影响."
-AMF.H264.BFrame.DeltaQP="B 帧差值 QP"
-AMF.H264.BFrame.DeltaQP.Description="对于非参考 B 帧, 相对于上一个 I 或者 P 帧的差值 QP 值."
-AMF.H264.BFrame.Reference="参考 B 帧"
-AMF.H264.BFrame.Reference.Description="允许 B 帧也使用 B 帧作为参考, 而不是只是 P 和 I 帧."
-AMF.H264.BFrame.ReferenceDeltaQP="参考 B 帧差值 QP"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="对于参考 B 帧, 相对于上一个 I 或者 P 帧的差值 QP 值."
-AMF.H264.DeblockingFilter="去块滤波"
-AMF.H264.DeblockingFilter.Description="设置解码器允许使用的标记, 用于编码流去块滤波器."
-AMF.H264.ScanType="扫描类型"
-AMF.H264.ScanType.Description="使用哪种扫描方法, 通常设置为'Progressive'."
-AMF.H264.ScanType.Progressive="渐进"
-AMF.H264.ScanType.Interlaced="交错"
-AMF.H264.MotionEstimation="移动侦测"
-AMF.H264.MotionEstimation.Description="运动侦测允许编码器通过估计一个像素去了哪里来降低比特率."
-AMF.H264.MotionEstimation.None="无"
-AMF.H264.MotionEstimation.Half="半像素"
-AMF.H264.MotionEstimation.Quarter="四分之一像素"
-AMF.H264.MotionEstimation.Both="半&四分之一像素"
-AMF.H264.CodingType="编码类型"
-AMF.H264.CodingType.Description="使用哪种编码类型:\n* \@AMF.Util.Default\@ 让 AMF 决定(推荐).\n* CALVC(上下文自适应可变长度编码) 速度更快但是更大.\n* CABAC(上下文自适应二进制算术编码)速度更慢, 但是更小."
+Utility.Default="默认"
+Utility.Automatic="自动"
+Utility.Manual="手动"
+Utility.Switch.Disabled="禁用"
+Utility.Switch.Enabled="启用"
+Preset="预设"
+Preset.ResetToDefaults="重置为默认值"
+Preset.Recording="录像"
+Preset.HighQuality="高质量"
+Preset.Indistinguishable="无法区分"
+Preset.Lossless="无损"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="使用"
+Usage.Description="AMF 应调整为什么用法:\n-'\@Usage.Transcoding\@' 是通用转码(推荐),\n- '\@Usage.UltraLowLatency\@' 用于真正低延迟编码,\n-'\@Usage.LowLatency\@' 类似于上面稍高的延迟.\n 推流仅支持 '\@Usage.Transcoding\@', 所有其他值可以用于录制."
+Usage.Transcoding="转码"
+Usage.UltraLowLatency="超低延迟"
+Usage.LowLatency="低延迟"
+Usage.Webcam="网络摄像头"
+QualityPreset="质量预设"
+QualityPreset.Description="AMF 应以哪种质量预设为目标:\n-'\@QualityPreset.Speed\@' 是最快但质量最差,\n-'\@QualityPreset.Balanced\@' 是平衡两者的混合,\n-'\@QualityPreset.Quality\@' 在给定的比特率下给出最好的质量."
+QualityPreset.Speed="速度"
+QualityPreset.Balanced="平衡"
+QualityPreset.Quality="质量"
+Profile="配置文件"
+Profile.Description="使用哪种配置文件进行编码. 从支持最广(顶部) 到质量最好(底部) 排序."
+ProfileLevel="配置等级"
+ProfileLevel.Description="要使用哪些配置文件级别. 最好设置在 \@Utility.Automatic\@."
+Tier="层"
+Tier.Description="要在哪一层进行编码. '高' 针对高比特率/高带宽的使用场景, 而 '主' 针对主流媒体."
+AspectRatio="长宽比"
+AspectRatio.Description="哪个长宽比应写入输出文件."
+CodingType="编码类型"
+CodingType.Description="使用哪种编码类型:\n* '\@Utility.Automatic\@' 让 AMF 决定(推荐).\n* 'CALVC'(上下文自适应可变长度编码) 速度更快但是更大.\n* 'CABAC'(上下文自适应二进制算术编码)速度更慢, 但是更小."
+MaximumReferenceFrames="最大参考帧"
+MaximumReferenceFrames.Description="编码器在编码时最多可以参考的帧数, 对编码质量有直接影响."
+RateControlMethod="速率控制方法"
+RateControlMethod.Description="应该使用什么比率控制方法:\n- '\@RateControlMethod.CQP\@' 赋值固定的 I/P/B 帧 QP(量化参数) 值,\n- '\@RateControlMethod.CBR\@' 保持在一个给定的目标比特率(使用填充数据)(推荐用于推流),\n- '\@RateControlMethod.VBR\@' 保持在给定的峰值比特率以下,\n- '\@RateControlMethod.VBRLAT\@' 在 GPU 延迟和负载允许时保持接近目标比特率, 否则将会使用更高的比特率(推荐用于录制)."
+RateControlMethod.CQP="恒定 QP (CQP)"
+RateControlMethod.CBR="固定比特率 (CBR)"
+RateControlMethod.VBR="变比特率 (峰值约束) (VBR)"
+RateControlMethod.VBRLAT="变比特率 (延迟约束) (VBR_LAT)"
+PrePassMode="前通模式"
+PrePassMode.Description="前通是次要的比特率分配阶段, 允许在一个序列内更好地分配比特率, 然而效果可能会随着不同的卡而不同."
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (四分之一大小)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (半尺寸)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (全尺寸)"
+Bitrate.Target="目标比特率"
+Bitrate.Target.Description="尝试在整个序列中获取的比特率."
+Bitrate.Peak="峰值比特率"
+Bitrate.Peak.Description="尝试在整个序列中最大获取的比特率."
+QP.IFrame="I 帧 QP"
+QP.IFrame.Description="用于 I 帧的固定的 QP 值."
+QP.PFrame="P 帧 QP"
+QP.PFrame.Description="用于 P 帧的固定 QP 值."
+QP.BFrame="B 帧 QP"
+QP.BFrame.Description="用于 B 帧的固定的 QP 值."
+QP.Minimum="最低 QP"
+QP.Minimum.Description="在一个帧中使用的最低 QP 值."
+QP.IFrame.Minimum="最小 I-帧 QP"
+QP.IFrame.Minimum.Description="在一个 I 帧中使用的最低 QP 值."
+QP.PFrame.Minimum="最小 P-帧 QP"
+QP.PFrame.Minimum.Description="在一个 P 帧中使用的最低 QP 值."
+QP.Maximum="最高 QP"
+QP.Maximum.Description="在一个帧中使用的最高 QP 值."
+QP.IFrame.Maximum="最大 I-帧 QP"
+QP.IFrame.Maximum.Description="在一个 I 帧中使用的最高 QP 值."
+QP.PFrame.Maximum="最大的 P 帧 QP"
+QP.PFrame.Maximum.Description="在一个 P 帧中使用的最高 QP 值."
+FillerData="填充数据"
+FillerData.Description="启用填充数据允许编码器至少保持 \@Bitrate.Target\@, 通过填满空信息序列中的剩余空间."
+FrameSkipping="跳过的帧"
+FrameSkipping.Description="跳过帧允许编码器丢帧, 以满足\@Bitrate.Target\@ 要求.\n当编码器丢帧时, 它会插入一个重复最后一帧的 NAL 到流中.\n可以帮助非常低的\@Bitrate.Target\@."
+VBAQ="VBAQ"
+VBAQ.Description="启用'基于方差的自适应量化'(VBAQ), 基于像素的方差来更好地分配比特率.\n它基于人类视觉系统对高纹理物体不敏感的原理, 因而将比特率更多用于更平滑的表面.\n启用这个可能会改善某些内容的主观质量."
+EnforceHRD="强制 HRD"
+EnforceHRD.Description="强制使用假设的参考解码器, 用于验证输出码流是否正确."
+VBVBuffer="VBV 缓存"
+VBVBuffer.Description="应使用什么方法来确定 VBV 缓冲区大小:\n-'\@Utility.Automatic\@' 计算使用严格约束的大小,\n- '\@Utility.Manual\@' 允许用户控制大小.\nVBV缓存 (视频缓冲程序验证程序) 用于某些率控制方法来保持整体的比特率在给定的约束内."
+VBVBuffer.Strictness="VBV 缓存规范性"
+VBVBuffer.Strictness.Description="决定 VBV 缓存的严格性, 100% 表示尽可能严格, 0% 表示无限制."
+VBVBuffer.Size="VBV 缓存大小"
+VBVBuffer.Size.Description="用于在一个序列中的控制比特率的 VBV 缓存的大小."
+VBVBuffer.InitialFullness="VBV 缓冲区初始满"
+VBVBuffer.InitialFullness.Description="VBV 缓存最初应该多满(百分比), 将只会影响最初的序列的编码."
+KeyframeInterval="关键帧间隔"
+KeyframeInterval.Description="关键帧之间间隔(以秒为单位)."
+H264.IDRPeriod="IDR 周期 (帧)"
+H264.IDRPeriod.Description="定义帧内瞬时解码刷新 (IDR) 的距离. 也控制 GOP 序列长度."
+H265.IDRPeriod="IDR 周期 (GOPs)"
+H265.IDRPeriod.Description="定义 GOP 之间瞬时解码刷新 (IDR) 的距离。"
+GOP.Type="GOP 类型"
+GOP.Type.Description="应使用哪种类型的 GOP:\n-'\@GOP.Type.Fixed\@' 将始终使用固定的 GOP 间距.\n-'\@GOP.Type.Variable\@' 允许各 GOP 按需要取不同大小.\n'\@GOP.Type.Fixed\@' 符合 H264 实现的工作方式, 最适合本地网络推流, 而 '\@GOP.Type.Variable\@' 最适合小体积高品质的录像."
+GOP.Type.Fixed="固定的"
+GOP.Type.Variable="可变的"
+GOP.Size="GOP 大小"
+GOP.Size.Description="一个 GOP (Group Of Pictures) 的大小, 以帧为单位"
+GOP.Size.Minimum="GOP 大小最小值"
+GOP.Size.Minimum.Description="帧中最小的 GOP (Group Of Pictures) 大小"
+GOP.Size.Maximum="GOP 大小最大值"
+GOP.Size.Maximum.Description="帧中最大的 GOP (Group Of Pictures) 大小"
+GOP.Alignment="GOP 对齐"
+GOP.Alignment.Description="实验, 效果是未知. 使用风险由自己负责."
+BFrame.Pattern="B 帧模式"
+BFrame.Pattern.Description="在编码时使用多少 B 帧.\n2 代和 3 代 VCE 卡支持. 对编码性能有负面影响."
+BFrame.DeltaQP="B 帧差值 QP"
+BFrame.DeltaQP.Description="对于非参考 B 帧, 相对于上一个 I 或者 P 帧的差值 QP 值."
+BFrame.Reference="B 帧参考"
+BFrame.Reference.Description="允许 B 帧也使用 B 帧作为参考, 而不是只是 P 和 I 帧."
+BFrame.ReferenceDeltaQP="B 帧参考差值 QP"
+BFrame.ReferenceDeltaQP.Description="对于参考 B 帧, 相对于上一个 I 或者 P 帧的差值 QP 值."
+DeblockingFilter="去块滤波"
+DeblockingFilter.Description="允许解码器应用去块滤波器."
+MotionEstimation="移动侦测"
+MotionEstimation.Description="运动侦测允许编码器通过预测一个像素去了哪里来降低需要的比特率."
+MotionEstimation.Quarter="四分之一像素"
+MotionEstimation.Half="半像素"
+MotionEstimation.Full="四分之一 & 半像素"
+Video.API="视频的 API"
+Video.API.Description="后端应使用什么 API?"
+Video.Adapter="视频适配器"
+Video.Adapter.Description="我们应该尝试在什么适配器上编码?"
+OpenCL="OpenCL"
+OpenCL.Description="OpenCL 应该用于提交帧吗? 从技术上讲更快, 但英特尔驱动程序会导致出问题(由于不兼容的 OpenCL 库)."
+View="查看模式"
+View.Description="应显示哪些属性?\n使用 '\@View.Master\@' 会使你丧失获得技术支持的资格."
+View.Basic="基本"
+View.Advanced="高级"
+View.Expert="专家"
+View.Master="主"
+Debug="调试"
+Debug.Description="启用额外的调试消息. 需要你用 '--verbose --log_unfiltered' 命令行参数运行 OBS Studio。"
AMF.H264.MaximumLTRFrames="最大的 LTR 帧"
AMF.H264.MaximumLTRFrames.Description="长期参考帧(LTR) 是一个特性允许编码器来标记在一个序列的一些帧作为参考帧一段时间.\nLTR 帧不能和 B 帧一起使用并且如果使用 LTR 帧, 编码器会禁用 B 帧."
AMF.H264.MaximumAccessUnitSize="最大访问单元大小"
AMF.H264.HeaderInsertionSpacing.Description="NAL 头之间应该多少帧. 不推荐改为非零(自动) 值."
AMF.H264.WaitForTask="等待任务"
AMF.H264.WaitForTask.Description="未知, 实验"
-AMF.H264.PreAnalysisPass="预分析通过"
-AMF.H264.PreAnalysisPass.Description="未知, 实验"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="未知, 实验"
-AMF.H264.GOPSize="GOP 大小"
-AMF.H264.GOPSize.Description="未知, 实验"
-AMF.H264.GOPAlignment="GOP 对齐"
-AMF.H264.GOPAlignment.Description="未知, 实验"
-AMF.H264.MaximumReferenceFrames="最大参考帧"
-AMF.H264.MaximumReferenceFrames.Description="未知, 实验"
AMF.H264.SlicesPerFrame="每帧的切片数"
AMF.H264.SlicesPerFrame.Description="每个帧应该和多少 I 帧切片一起存储?\n0 值让编码器运行时决定.\n内部刷新编码用于更快的回放和定位."
AMF.H264.SliceMode="切片模式"
AMF.H264.IntraRefresh.NumberOfStripes.Description="未知, 实验"
AMF.H264.IntraRefresh.MacroblocksPerSlot="每个插槽的帧内刷新宏"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="每个槽应该和多少宏块一起存储?\n0 值让编码器在运行时决定.\n帧内刷新编码用于更快的回放和定位."
-AMF.H264.VideoAPI="视频的 API"
-AMF.H264.VideoAPI.Description="要用于编码的 API。"
-AMF.H264.VideoAdapter="视频适配器"
-AMF.H264.VideoAdapter.Description="哪种适配器用来编码."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="编码器应该使用 OpenCL 提交单独的帧吗>"
-AMF.H264.View="查看模式"
-AMF.H264.View.Description="哪些属性应该是可见的. 当使用 '专家' 或者 '大师' 模式时, 您将不会得到支持."
-AMF.H264.View.Basic="基本"
-AMF.H264.View.Advanced="高级"
-AMF.H264.View.Expert="专家"
-AMF.H264.View.Master="主"
-AMF.H264.Debug="调试"
-AMF.H264.Debug.Description="启用附加调试日志记录, 应该是启用的, 无论何时您需要此编码器的支持."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Resources/locale/zh-TW.ini -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Resources/locale/zh-TW.ini
Changed
-AMF.Util.Default="預設"
-AMF.Util.Automatic="自動"
-AMF.Util.Manual="手動"
-AMF.Util.Toggle.Disabled="停用"
-AMF.Util.Toggle.Enabled="啟用"
-AMF.H264.Preset="預設"
-AMF.H264.Preset.ResetToDefaults="重設為預設值"
-AMF.H264.Preset.Recording="錄影"
-AMF.H264.Preset.HighQuality="高品質"
-AMF.H264.Preset.Indistinguishable="無法區分"
-AMF.H264.Preset.Lossless="無損"
-AMF.H264.Preset.Twitch="Twitch"
-AMF.H264.Preset.YouTube="YouTube"
-AMF.H264.Usage="用法"
-AMF.H264.Usage.Description="根據用途 AMF 應調整為︰ \n- '\@AMF.H264.Usage.Transcoding\@' 是通用轉碼 (推薦),\n- '\@AMF.H264.Usage.UltraLowLatency\@' 是為了真的非常低延遲的編碼,\n- '\@AMF.H264.Usage.LowLatency\@'是類似於上面但延遲稍高。 \n串流僅支援'\@AMF.H264.Usage.Transcoding\@',其他值可以用於錄影。"
-AMF.H264.Usage.Transcoding="轉碼"
-AMF.H264.Usage.UltraLowLatency="超低延遲"
-AMF.H264.Usage.LowLatency="低延遲"
-AMF.H264.QualityPreset="預設品質"
-AMF.H264.QualityPreset.Description="AMF 嘗試達成的預設品質:\n- '\@AMF.H264.QualityPreset.Speed\@' 為最快但最差的品質,\n- '\@AMF.H264.QualityPreset.Balanced\@' 為中間點,\n- '\@AMF.H264.QualityPreset.Quality\@' 在給定的位元速率下有最佳的品質。"
-AMF.H264.QualityPreset.Speed="速度"
-AMF.H264.QualityPreset.Balanced="平衡"
-AMF.H264.QualityPreset.Quality="品質"
-AMF.H264.Profile="設定檔"
-AMF.H264.Profile.Description="用於編碼的 H.264 設定:\n- 'Baseline' 有最廣泛的平台支援度,\n- 'Main' 較舊的裝置也支援 (如果目標是行動裝置,建議選此),\n- 'High' 當前主流裝置支援 (建議選此)。"
-AMF.H264.ProfileLevel="設定檔級別"
-AMF.H264.ProfileLevel.Description="用於編碼的 H.264 設定等級:\n- '\@AMF.Util.Automatic\@'根據給定的畫面播放速率以及尺寸計算最佳的設定等級,\n- '4.1'支援 1920x1080 30FPS, 1280x720 60FPS, 960x540 90FPS\n- '4.2'支援 1920x1080 60FPS, 1280x720 120FPS, 960x540 172FPS\n- '5.0'支援 1920x1080 60FPS, 1280x720 144FPS, 960x540 172FPS\n- '5.1'支援3840x2160 30FPS, 1920x1080 120FPS, 1280x720 172FPS, 960x540 172FPS\n- '5.2'支援3840x2160 60FPS, 1920x1080 172FPS, 1280x720 172FPS, 960x540 172FPS"
-AMF.H264.RateControlMethod="速率控制方法"
-AMF.H264.RateControlMethod.Description="該使用什麼位元速率控制方法:\n- '\@AMF.H264.RateControlMethod.CQP\@' 指定固定的 I-/P-/B-訊框 QP (量化參數)值,\n- '\@AMF.H264.RateControlMethod.CBR\@' 固定在給定的目標位元速率 (藉由填塞空白資料)(建議用於串流),\n- '\@AMF.H264.RateControlMethod.VBR\@' 保持低於給定的位元速率最大值,\n- '\@AMF.H264.RateControlMethod.VBR_LAT\@' 在 GPU 延遲跟負載允許的情況下盡量接近目標位元速率,不足時使用較高的位元速率(建議用於錄影)。"
-AMF.H264.RateControlMethod.CQP="固定 QP (CQP)"
-AMF.H264.RateControlMethod.CBR="固定位元率 (CBR)"
-AMF.H264.RateControlMethod.VBR.Peak="可變位元率 (限制最大位元率) (VBR)"
-AMF.H264.RateControlMethod.VBR.Latency="可變位元率 (限制延遲) (VBR)"
-AMF.H264.Bitrate.Target="目標位元率"
-AMF.H264.Bitrate.Target.Description="在整體過程中嘗試達成的位元速率。"
-AMF.H264.Bitrate.Peak="最大位元率"
-AMF.H264.Bitrate.Peak.Description="在整體過程中嘗試不超過的位元速率。"
-AMF.H264.QP.Minimum="最低 QP 值"
-AMF.H264.QP.Minimum.Description="一個訊框中使用的最低 QP (量化參數) 值。"
-AMF.H264.QP.Maximum="最大 QP 值"
-AMF.H264.QP.Maximum.Description="一個訊框中使用的最高 QP (量化參數) 值。"
-AMF.H264.QP.IFrame="I-訊框 QP"
-AMF.H264.QP.IFrame.Description="I 訊框所使用的固定 QP 值"
-AMF.H264.QP.PFrame="P-訊框 QP"
-AMF.H264.QP.PFrame.Description="P 訊框所使用的固定 QP 值"
-AMF.H264.QP.BFrame="B-訊框 QP"
-AMF.H264.QP.BFrame.Description="B-訊框使用的固定 QP (量化參數) 值。"
-AMF.H264.VBVBuffer="VBV(Video Buffering Verifier) 緩衝區"
-AMF.H264.VBVBuffer.Description="決定 VBV 緩衝區大小的方法:\n- '\@AMF.Util.Automatic\@' 根據嚴格性計算大小,\n- '\@AMF.Util.Manual\@' 允許使用者調整大小。\n 某些速率控制方法會使用 VBV (Video Buffering Verifier) 緩衝區讓整體的位元速率保持在給定的條件之內。"
-AMF.H264.VBVBuffer.Strictness="VBV 緩衝區嚴格性"
-AMF.H264.VBVBuffer.Strictness.Description="決定 VBV 緩衝區的嚴密性,100% 表示盡可能的嚴格,0% 表示完全不限制。"
-AMF.H264.VBVBuffer.Size="VBV 緩衝區大小"
-AMF.H264.VBVBuffer.Size.Description="VBV緩衝區的大小,這將用於過程中的位元速率控制"
-AMF.H264.VBVBuffer.Fullness="初始 VBV 緩衝區填充度"
-AMF.H264.VBVBuffer.Fullness.Description="VBV 緩衝區起始該多滿,將只影響編碼的起始。"
-AMF.H264.FillerData="填塞空白資料"
-AMF.H264.FillerData.Description="啟用填塞空白資料允許編碼器保持位元速率至少有目標位元速率,編碼器將會將不足的位元用空白訊息填滿"
-AMF.H264.FrameSkipping="省略訊框"
-AMF.H264.FrameSkipping.Description="省略訊框允許編碼器在為了達成目標位元速率時省略部份訊框。\n當編碼器省略訊框時,他將插入重複上一個訊框的指令。\n對極低的目標位元速率有幫助。"
-AMF.H264.EnforceHRDCompatibility="與 HRD 相容"
-AMF.H264.EnforceHRDCompatibility.Description="執行 Hypothetical Reference Decoder 限制,這將限制單一訊框內 QP 值的最大變化量。"
-AMF.H264.KeyframeInterval="關鍵訊框間隔"
-AMF.H264.KeyframeInterval.Description="以秒為單位,定義關鍵訊框之間的距離。同時也控制了 GOP 序列的大小。"
-AMF.H264.IDRPeriod="IDR 周期"
-AMF.H264.IDRPeriod.Description="以訊框為單位,定義瞬間解碼重新更新(Instantaneous Decoding Refreshes)間的距離。同時也控制了 GOP 序列的大小。"
-AMF.H264.BFrame.Pattern="B 訊框"
-AMF.H264.BFrame.Pattern.Description="編碼時使用的 B 訊框數量。\n搭載第二代和第三代影像編碼引擎的顯示卡支援此參數。對編碼效能會有不良影響。"
-AMF.H264.BFrame.DeltaQP="B 訊框的 QP 值變化"
-AMF.H264.BFrame.DeltaQP.Description="在沒使用可參照 B 訊框時, B 訊框與上一個I或P訊框間的QP值變化。"
-AMF.H264.BFrame.Reference="可參照 B 訊框"
-AMF.H264.BFrame.Reference.Description="允許 B 訊框也使用 B 訊框作為參照,而不限於 P 和 I 訊框。"
-AMF.H264.BFrame.ReferenceDeltaQP="可參照 B 訊框的 QP 值變化"
-AMF.H264.BFrame.ReferenceDeltaQP.Description="在使用可參照 B 訊框時,B 訊框與上一個 I 或 P 訊框間的 QP 值變化。"
-AMF.H264.DeblockingFilter="去塊狀色斑濾鏡"
-AMF.H264.DeblockingFilter.Description="設定選項,讓解碼器可以使用去塊狀色斑濾鏡。"
-AMF.H264.ScanType="掃描類型"
-AMF.H264.ScanType.Description="掃描方式,請確保此值為'\@AMF.H264.ScanType.Progressive\@'。"
-AMF.H264.ScanType.Progressive="漸進式"
-AMF.H264.ScanType.Interlaced="交错式"
-AMF.H264.MotionEstimation="動態估算"
-AMF.H264.MotionEstimation.Description="動態估算讓編碼器藉由估算像素移動來減少位元速率。"
-AMF.H264.MotionEstimation.None="無"
-AMF.H264.MotionEstimation.Half="半像素"
-AMF.H264.MotionEstimation.Quarter="四分之一像素"
-AMF.H264.MotionEstimation.Both="半 & 四分之一像素"
-AMF.H264.CodingType="Coding Type"
-AMF.H264.CodingType.Description="Which type of coding to use:\n* \@AMF.Util.Default\@ lets AMF decide (recommended).\n* CALVC (Context-Adaptive Variable-Length Coding) is faster, but larger.\n* CABAC (Context-Adaptive Binary Arithmetic Coding) is slower, but smaller."
+Utility.Default="預設"
+Utility.Automatic="自動"
+Utility.Manual="自訂"
+Utility.Switch.Disabled="已停用"
+Utility.Switch.Enabled="已啟用"
+Preset="預設"
+Preset.ResetToDefaults="重設為預設值"
+Preset.Recording="錄影"
+Preset.HighQuality="高畫質"
+Preset.Indistinguishable="無法區分"
+Preset.Lossless="無損"
+Preset.Twitch="Twitch"
+Preset.YouTube="YouTube"
+Usage="使用情況"
+Usage.Description="根據用途 AMF 應調整為︰ \n- '\@Usage.Transcoding\@' 是通用轉碼 (推薦),\n- '\@Usage.UltraLowLatency\@' 是為了真的非常低延遲的編碼,\n- '\@Usage.LowLatency\@'是類似於上面但延遲稍高。 \n串流僅支援'\@Usage.Transcoding\@',其他值可以用於錄影。"
+Usage.Transcoding="轉碼"
+Usage.UltraLowLatency="超低延遲"
+Usage.LowLatency="低延遲"
+Usage.Webcam="網路攝影機"
+QualityPreset="品質預設"
+QualityPreset.Description="AMF 嘗試達成的預設品質:\n- '\@QualityPreset.Speed\@' 為最快但最差的品質,\n- '\@QualityPreset.Balanced\@' 為中間點,\n- '\@QualityPreset.Quality\@' 在給定的位元速率下有最佳的品質。"
+QualityPreset.Speed="速度"
+QualityPreset.Balanced="平衡"
+QualityPreset.Quality="品質"
+Profile="設定"
+Profile.Description="使用哪種設定進行編碼。從最廣泛被支援(頂部)到最好的畫質(底部)排序。"
+ProfileLevel="設定級別"
+ProfileLevel.Description="使用哪種設定級別。建議保持設定為\@Utility.Automatic\@。"
+Tier="Tier"
+Tier.Description="對哪種 Tier 編碼。'High' 針對高位元率/頻寬的情境,'Main' 針對主流媒介。"
+AspectRatio="長寬比"
+AspectRatio.Description="寫進輸出檔案的長寬比"
+CodingType="編碼類型"
+CodingType.Description="使用哪種類型的編碼︰ \n* '\@Utility.Automatic\@' 讓 AMF 決定 (推薦)。 \n* 'CALVC' (上下文自我調整可變長度編碼) 較快,但產出較大的檔案。 \n* 'CABAC' (上下文自我調整二進位算術編碼) 較慢,但較小。"
+MaximumReferenceFrames="最大參考訊框"
+MaximumReferenceFrames.Description="編碼器在編碼時最多可參照的訊框數,直接影響編碼品質。"
+RateControlMethod="速率控制方法"
+RateControlMethod.Description="該使用什麼位元速率控制方法:\n- '\@RateControlMethod.CQP\@' 指定固定的 I-/P-/B-訊框 QP (量化參數)值,\n- '\@RateControlMethod.CBR\@' 固定在給定的目標位元速率 (藉由填塞空白資料)(建議用於串流),\n- '\@RateControlMethod.VBR\@' 保持低於給定的位元速率最大值,\n- '\@RateControlMethod.VBRLAT\@' 在 GPU 延遲跟負載允許的情況下盡量接近目標位元速率,不足時使用較高的位元速率(建議用於錄影)。"
+RateControlMethod.CQP="恒定 QP (CQP)"
+RateControlMethod.CBR="固定位元率 (CBR)"
+RateControlMethod.VBR="可變位元率 (限制最大位元率) (VBR)"
+RateControlMethod.VBRLAT="可變位元率 (限制延遲) (VBR_LAT)"
+PrePassMode="前置階段模式"
+PrePassMode.Description="前置階段是次要的位元率分配階段,此階段讓過程內的位元率能有更好的分配,然而這個效果對每個顯示卡都有所不同。"
+PrePassMode.Quarter="\@Utility.Switch.Enabled\@ (四分之一大小)"
+PrePassMode.Half="\@Utility.Switch.Enabled\@ (半尺寸)"
+PrePassMode.Full="\@Utility.Switch.Enabled\@ (全尺寸)"
+Bitrate.Target="目標位元率"
+Bitrate.Target.Description="在整體過程中嘗試達成的位元速率。"
+Bitrate.Peak="最大位元率"
+Bitrate.Peak.Description="在整體過程中嘗試不超過的位元速率。"
+QP.IFrame="I-訊框 QP"
+QP.IFrame.Description="I 訊框所使用的固定 QP 值"
+QP.PFrame="P-訊框 QP"
+QP.PFrame.Description="P 訊框所使用的固定 QP 值"
+QP.BFrame="B-訊框 QP"
+QP.BFrame.Description="B 訊框所使用的固定 QP 值"
+QP.Minimum="最低 QP 值"
+QP.Minimum.Description="在一個訊框中使用的最低 QP 值。"
+QP.IFrame.Minimum="最低 I-訊框 QP"
+QP.IFrame.Minimum.Description="在一個 I-訊框中使用的最低 QP 值。"
+QP.PFrame.Minimum="最低 P-訊框 QP"
+QP.PFrame.Minimum.Description="在一個 P-訊框中使用的最低 QP 值。"
+QP.Maximum="最大 QP 值"
+QP.Maximum.Description="在一個訊框中使用的最高 QP 值。"
+QP.IFrame.Maximum="最大 I-訊框 QP"
+QP.IFrame.Maximum.Description="在一個 I-訊框中使用的最高 QP 值。"
+QP.PFrame.Maximum="最大 P-訊框 QP"
+QP.PFrame.Maximum.Description="在一個 P-訊框中使用的最高 QP 值。"
+FillerData="填塞空白資料"
+FillerData.Description="啟用填塞空白資料允許編碼器保持位元速率至少有\@Bitrate.Target\@,編碼器將會將不足的位元用空白訊息填滿"
+FrameSkipping="省略訊框"
+FrameSkipping.Description="省略訊框允許編碼器在為了達成\@Bitrate.Target\@時省略部份訊框。\n當編碼器省略訊框時,他將插入重複上一個訊框的指令。\n對極低的\@Bitrate.Target\@有幫助。"
+VBAQ="VBAQ"
+EnforceHRD="強制使用 HRD"
+EnforceHRD.Description="強制使用 Hypothetical Reference Decoder,HRD 用於驗證輸出位元流是否正確。"
+VBVBuffer="VBV(Video Buffering Verifier) 緩衝區"
+VBVBuffer.Description="決定 VBV 緩衝區大小的方法:\n- '\@Utility.Automatic\@' 根據嚴格性計算大小,\n- '\@Utility.Manual\@' 允許使用者調整大小。\n 某些速率控制方法會使用 VBV (Video Buffering Verifier) 緩衝區讓整體的位元速率保持在給定的條件之內。"
+DeblockingFilter="去塊狀色斑濾鏡"
+DeblockingFilter.Description="允許解碼器使用去塊狀色斑濾鏡。"
+MotionEstimation="動態估算"
+MotionEstimation.Description="動態估算讓編碼器藉由估算像素移動來減少位元速率。"
+MotionEstimation.Quarter="四分之一像素"
+MotionEstimation.Half="半像素"
+MotionEstimation.Full="四分之一與半像素"
+Video.API="影像 API"
+Video.API.Description="後端應使用什麼 API?"
+Video.Adapter="顯示卡"
+Video.Adapter.Description="該嘗試用那個顯示卡編碼?"
+OpenCL="OpenCL"
+OpenCL.Description="使用 OpenCL 提交訊框?技術上更快,但 (由於不相容的 OpenCL 函式庫) 會跟 Intel 驅動程式相衝。"
+View="檢視模式"
+View.Description="該顯示哪些屬性?\n使用'\@View.Master\@'時將不提供任何的支援。"
+View.Basic="基本"
+View.Advanced="進階"
+View.Expert="專家"
+View.Master="上帝模式"
+Debug="除錯模式"
+Debug.Description="啟用額外的除錯訊息。需要用 '--verbose --log_unfiltered' 命令列來啟動 OBS"
AMF.H264.MaximumLTRFrames="Maximum LTR Frames"
AMF.H264.MaximumLTRFrames.Description="Long Term Reference (LTR) Frames are a feature that allows the encoder to flag certain frames in a sequence as referencable for a long time.\nLTR Frames can't be used with B-Frames and the encoder will disable B-Frames if these are used."
AMF.H264.MaximumAccessUnitSize="Maximum Access Unit Size"
AMF.H264.HeaderInsertionSpacing.Description="How many frames should be between NAL headers."
AMF.H264.WaitForTask="Wait For Task"
AMF.H264.WaitForTask.Description="Unknown, Experimental"
-AMF.H264.PreAnalysisPass="預分析階段"
-AMF.H264.PreAnalysisPass.Description="Unknown, Experimental"
-AMF.H264.VBAQ="VBAQ"
-AMF.H264.VBAQ.Description="Unknown, Experimental"
-AMF.H264.GOPSize="GOP Size"
-AMF.H264.GOPSize.Description="Unknown, Experimental"
-AMF.H264.GOPAlignment="GOP Alignment"
-AMF.H264.GOPAlignment.Description="Unknown, Experimental"
-AMF.H264.MaximumReferenceFrames="Maximum Reference Frames"
-AMF.H264.MaximumReferenceFrames.Description="Unknown, Experimental"
AMF.H264.SlicesPerFrame="Slices Per Frame"
AMF.H264.SlicesPerFrame.Description="How many I-Frame slices should be stored with each frame?\nA value of zero lets the encoder decide on the fly.\nIntra-Refresh encoding is used for faster playback and seeking."
AMF.H264.SliceMode="Slice Mode"
AMF.H264.IntraRefresh.NumberOfStripes.Description="Unknown, Experimental"
AMF.H264.IntraRefresh.MacroblocksPerSlot="Intra-Refresh Macroblocks per Slot"
AMF.H264.IntraRefresh.MacroblocksPerSlot.Description="How many Macroblocks should be stored in each slot?\nA value of 0 disables this feature.\nIntra-Refresh encoding is used for faster playback and seeking."
-AMF.H264.VideoAPI="Video API"
-AMF.H264.VideoAPI.Description="The API used for encoding."
-AMF.H264.VideoAdapter="Video Adapter"
-AMF.H264.VideoAdapter.Description="The video adapter used for encoding."
-AMF.H264.OpenCL="OpenCL"
-AMF.H264.OpenCL.Description="Should the encoder use OpenCL to submit each frame?"
-AMF.H264.View="View Mode"
-AMF.H264.View.Description="Which properties should be shown?\nNo support is provided when '\@AMF.H264.View.Master\@' is used."
-AMF.H264.View.Basic="Basic"
-AMF.H264.View.Advanced="Advanced"
-AMF.H264.View.Expert="Expert"
-AMF.H264.View.Master="Master"
-AMF.H264.Debug="Debug Mode"
-AMF.H264.Debug.Description="Enables additional debug logging; this should be enabled whenever support for this encoder is needed."
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/amf-capabilities.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/amf-capabilities.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
SOFTWARE.
*/
-#pragma once
-
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
-#include <string>
-#include <sstream>
-
-#ifdef _WIN32
-#include <windows.h>
-#include <VersionHelpers.h>
-#endif
-
#include "amf-capabilities.h"
-#include "api-d3d11.h"
-#include "api-d3d9.h"
-#include "misc-util.cpp"
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
-
-Plugin::AMD::VCEDeviceCapabilities::VCEDeviceCapabilities() {
- acceleration_type = amf::AMF_ACCEL_NOT_SUPPORTED;
- maxProfile =
- maxProfileLevel =
- maxBitrate =
- minReferenceFrames =
- maxReferenceFrames =
- maxTemporalLayers =
- maxNumOfStreams =
- maxNumOfHwInstances =
- input.minWidth =
- input.maxWidth =
- input.minHeight =
- input.maxHeight =
- input.verticalAlignment =
- output.minWidth =
- output.maxWidth =
- output.minHeight =
- output.maxHeight =
- output.verticalAlignment = 0;
- supportsBFrames =
- supportsFixedSliceMode =
- input.supportsInterlaced =
- output.supportsInterlaced = false;
- input.formats.clear();
- input.memoryTypes.clear();
- output.formats.clear();
- output.memoryTypes.clear();
+#include "utility.h"
+
+//#ifdef _WIN32
+//#include <windows.h>
+//#include <VersionHelpers.h>
+//#endif
+
+using namespace Plugin;
+using namespace Plugin::AMD;
+
+#pragma region Singleton
+static CapabilityManager* __instance;
+static std::mutex __instance_mutex;
+void Plugin::AMD::CapabilityManager::Initialize() {
+ const std::lock_guard<std::mutex> lock(__instance_mutex);
+ if (!__instance)
+ __instance = new CapabilityManager();
}
-std::shared_ptr<Plugin::AMD::VCECapabilities> Plugin::AMD::VCECapabilities::GetInstance() {
- static std::shared_ptr<VCECapabilities> __instance = std::make_shared<VCECapabilities>();
- static std::mutex __mutex;
-
- const std::lock_guard<std::mutex> lock(__mutex);
+CapabilityManager* Plugin::AMD::CapabilityManager::Instance() {
+ const std::lock_guard<std::mutex> lock(__instance_mutex);
return __instance;
}
-void Plugin::AMD::VCECapabilities::ReportCapabilities(std::shared_ptr<Plugin::API::Base> api) {
- auto inst = GetInstance();
- auto adapters = api->EnumerateAdapters();
- for (auto adapter : adapters) {
- ReportAdapterCapabilities(api, adapter);
- }
-}
-
-void Plugin::AMD::VCECapabilities::ReportAdapterCapabilities(std::shared_ptr<Plugin::API::Base> api, Plugin::API::Adapter adapter) {
- auto inst = GetInstance();
- H264EncoderType types[] = {
- H264EncoderType::AVC,
- H264EncoderType::SVC,
- };
-
- AMF_LOG_INFO("Capabilities for %s adapter '%s':",
- api->GetName().c_str(),
- adapter.Name.c_str());
-
- for (auto type : types) {
- ReportAdapterTypeCapabilities(api, adapter, type);
- }
-}
-
-void Plugin::AMD::VCECapabilities::ReportAdapterTypeCapabilities(std::shared_ptr<Plugin::API::Base> api, Plugin::API::Adapter adapter, H264EncoderType type) {
- auto inst = GetInstance();
- auto caps = inst->GetAdapterCapabilities(api, adapter, type);
-
- AMF_LOG_INFO(" %s (Acceleration: %s)",
- (type == H264EncoderType::AVC ? "AVC" : (type == H264EncoderType::SVC ? "SVC" : "Unknown")),
- (caps.acceleration_type == amf::AMF_ACCEL_SOFTWARE ? "Software" : (caps.acceleration_type == amf::AMF_ACCEL_GPU ? "GPU" : (caps.acceleration_type == amf::AMF_ACCEL_HARDWARE ? "Hardware" : "None")))
- );
-
- if (caps.acceleration_type == amf::AMF_ACCEL_NOT_SUPPORTED)
- return;
-
- AMF_LOG_INFO(" Limits");
- AMF_LOG_INFO(" Profile: %s", Plugin::Utility::ProfileAsString((H264Profile)caps.maxProfile));
- AMF_LOG_INFO(" Profile Level: %ld.%ld", caps.maxProfileLevel / 10, caps.maxProfileLevel % 10);
- AMF_LOG_INFO(" Bitrate: %ld", caps.maxBitrate);
- AMF_LOG_INFO(" Reference Frames: %ld (min) - %ld (max)", caps.minReferenceFrames, caps.maxReferenceFrames);
- if (type == H264EncoderType::SVC)
- AMF_LOG_INFO(" Temporal Layers: %ld", caps.maxTemporalLayers);
- AMF_LOG_INFO(" Features");
- AMF_LOG_INFO(" B-Frames: %s", caps.supportsBFrames ? "Supported" : "Not Supported");
- AMF_LOG_INFO(" Fixed Slice Mode: %s", caps.supportsFixedSliceMode ? "Supported" : "Not Supported");
- AMF_LOG_INFO(" Instancing");
- AMF_LOG_INFO(" # of Streams: %ld", caps.maxNumOfStreams);
- AMF_LOG_INFO(" # of Instances: %ld", caps.maxNumOfHwInstances);
- AMF_LOG_INFO(" Input");
- ReportAdapterTypeIOCapabilities(api, adapter, type, false);
- AMF_LOG_INFO(" Output");
- ReportAdapterTypeIOCapabilities(api, adapter, type, true);
-}
-
-void Plugin::AMD::VCECapabilities::ReportAdapterTypeIOCapabilities(std::shared_ptr<Plugin::API::Base> api, Plugin::API::Adapter adapter, H264EncoderType type, bool output) {
- auto amf = Plugin::AMD::AMF::GetInstance();
-
- auto inst = GetInstance();
- VCEDeviceCapabilities::IOCaps ioCaps = output
- ? inst->GetAdapterCapabilities(api, adapter, type).output
- : inst->GetAdapterCapabilities(api, adapter, type).input;
- AMF_LOG_INFO(" Resolution: %ldx%ld - %ldx%ld",
- ioCaps.minWidth, ioCaps.minHeight,
- ioCaps.maxWidth, ioCaps.maxHeight);
- AMF_LOG_INFO(" Vertical Alignment: %ld", ioCaps.verticalAlignment);
- AMF_LOG_INFO(" Interlaced: %s", ioCaps.supportsInterlaced ? "Supported" : "Not Supported");
- std::stringstream formatstr;
- for (auto format : ioCaps.formats) {
- std::vector<char> buf(1024);
- wcstombs(buf.data(), amf->GetTrace()->SurfaceGetFormatName(format.first), 1024);
- formatstr
- << buf.data()
- << (format.second ? " (Native)" : "")
- << ", ";
- }
- AMF_LOG_INFO(" Formats: %s", formatstr.str().c_str());
- std::stringstream memorystr;
- for (auto memory : ioCaps.memoryTypes) {
- std::vector<char> buf(1024);
- wcstombs(buf.data(), amf->GetTrace()->GetMemoryTypeName(memory.first), 1024);
- memorystr
- << buf.data()
- << (memory.second ? " (Native)" : "")
- << ", ";
- }
- AMF_LOG_INFO(" Memory Types: %s", memorystr.str().c_str());
-}
-
-Plugin::AMD::VCECapabilities::VCECapabilities() {
- this->Refresh();
-}
-
-Plugin::AMD::VCECapabilities::~VCECapabilities() {}
-
-static AMF_RESULT GetIOCapability(bool output, amf::AMFCapsPtr amfCaps, Plugin::AMD::VCEDeviceCapabilities::IOCaps* caps) {
- AMF_RESULT res = AMF_OK;
- amf::AMFIOCapsPtr amfIOCaps;
- if (output)
- res = amfCaps->GetOutputCaps(&amfIOCaps);
- else
- res = amfCaps->GetInputCaps(&amfIOCaps);
- if (res != AMF_OK)
- return res;
-
- amfIOCaps->GetWidthRange(&(caps->minWidth), &(caps->maxWidth));
- amfIOCaps->GetHeightRange(&(caps->minHeight), &(caps->maxHeight));
- caps->supportsInterlaced = amfIOCaps->IsInterlacedSupported();
- caps->verticalAlignment = amfIOCaps->GetVertAlign();
-
- int32_t numFormats = amfIOCaps->GetNumOfFormats();
- caps->formats.resize(numFormats);
- for (int32_t formatIndex = 0; formatIndex < numFormats; formatIndex++) {
- amf::AMF_SURFACE_FORMAT format = amf::AMF_SURFACE_UNKNOWN;
- bool isNative = false;
-
- amfIOCaps->GetFormatAt(formatIndex, &format, &isNative);
- caps->formats[formatIndex].first = format;
- caps->formats[formatIndex].second = isNative;
- }
-
- int32_t numMemoryTypes = amfIOCaps->GetNumOfMemoryTypes();
- caps->memoryTypes.resize(numMemoryTypes);
- for (int32_t memoryTypeIndex = 0; memoryTypeIndex < numMemoryTypes; memoryTypeIndex++) {
- amf::AMF_MEMORY_TYPE type = amf::AMF_MEMORY_UNKNOWN;
- bool isNative = false;
-
- amfIOCaps->GetMemoryTypeAt(memoryTypeIndex, &type, &isNative);
- caps->memoryTypes[memoryTypeIndex].first = type;
- caps->memoryTypes[memoryTypeIndex].second = isNative;
- }
-
- return AMF_OK;
+void Plugin::AMD::CapabilityManager::Finalize() {
+ const std::lock_guard<std::mutex> lock(__instance_mutex);
+ if (__instance)
+ delete __instance;
+ __instance = nullptr;
}
-
-bool Plugin::AMD::VCECapabilities::Refresh() {
- AMF_RESULT res;
-
- auto amfInstance = AMD::AMF::GetInstance();
- auto amfFactory = amfInstance->GetFactory();
-
- auto APIs = Plugin::API::Base::EnumerateAPIs();
- for (auto api : APIs) {
- std::vector<Plugin::API::Adapter> adapters;
- try {
- adapters = api->EnumerateAdapters();
- } catch (std::exception e) {
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Exception while enumerating %s adapters: %s.",
- api->GetName().c_str(),
- e.what());
- continue;
- } catch (...) {
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Critical exception while enumerating %s adapters.",
- api->GetName().c_str());
- throw;
- }
-
- for (auto adapter : adapters) {
- // Create AMF Instance
- amf::AMFContextPtr amfContext;
- res = amfFactory->CreateContext(&amfContext);
- if (res != AMF_OK) {
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Unable to create context on %s adapter '%s', error %ls (code %d).",
- api->GetName().c_str(),
- adapter.Name.c_str(),
- amfInstance->GetTrace()->GetResultText(res),
- res);
- continue;
- }
-
- void* apiInst = nullptr;
- try {
- apiInst = api->CreateInstanceOnAdapter(adapter);
- } catch (std::exception e) {
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Exception while initializing %s adapter '%s': %s.",
- api->GetName().c_str(),
- adapter.Name,
- e.what());
- continue;
- } catch (...) {
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Critical exception while initializing %s adapter '%s'.",
- api->GetName().c_str(),
- adapter.Name);
- throw;
- }
- switch (api->GetType()) {
- case Plugin::API::Type::Direct3D11:
- res = amfContext->InitDX11(api->GetContextFromInstance(apiInst));
- break;
- case Plugin::API::Type::Direct3D9:
- res = amfContext->InitDX9(api->GetContextFromInstance(apiInst));
- break;
- case Plugin::API::Type::OpenGL:
- res = amfContext->InitOpenGL(api->GetContextFromInstance(apiInst), nullptr, nullptr);
- break;
- default:
- res = AMF_OK;
- break;
- }
- if (res != AMF_OK) {
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Initialization failed for %s adapter '%s', error %ls (code %d).",
- api->GetName().c_str(),
- adapter.Name.c_str(),
- amfInstance->GetTrace()->GetResultText(res),
- res);
- continue;
- }
-
- H264EncoderType types[] = {
- H264EncoderType::AVC,
- H264EncoderType::SVC
- };
- for (auto type : types) {
- VCEDeviceCapabilities devCaps = VCEDeviceCapabilities();
-
- amf::AMFComponentPtr amfComponent;
- res = amfFactory->CreateComponent(amfContext,
- Plugin::Utility::VCEEncoderTypeAsAMF(type),
- &amfComponent);
- if (res != AMF_OK) {
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Unable to create component for %s adapter '%s' with codec '%s', error %ls (code %d).",
- api->GetName().c_str(),
- adapter.Name.c_str(),
- Plugin::Utility::VCEEncoderTypeAsString(type),
- amfInstance->GetTrace()->GetResultText(res), res);
- continue;
- }
-
- amf::AMFCapsPtr amfCaps;
- res = amfComponent->GetCaps(&amfCaps);
- if (res != AMF_OK) {
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Unable to query capabilities for %s adapter '%s' with codec '%s', error %ls (code %d).",
- api->GetName().c_str(),
- adapter.Name.c_str(),
- Plugin::Utility::VCEEncoderTypeAsString(type),
- amfInstance->GetTrace()->GetResultText(res), res);
- amfComponent->Terminate();
- continue;
- }
-
- devCaps.acceleration_type = amfCaps->GetAccelerationType();
- if (devCaps.acceleration_type != amf::AMF_ACCEL_NOT_SUPPORTED) {
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_MAX_BITRATE, &(devCaps.maxBitrate));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_NUM_OF_STREAMS, &(devCaps.maxNumOfStreams));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_MAX_PROFILE, &(devCaps.maxProfile));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_MAX_LEVEL, &(devCaps.maxProfileLevel));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_BFRAMES, &(devCaps.supportsBFrames));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_MIN_REFERENCE_FRAMES, &(devCaps.minReferenceFrames));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_MAX_REFERENCE_FRAMES, &(devCaps.maxReferenceFrames));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_MAX_TEMPORAL_LAYERS, &(devCaps.maxTemporalLayers));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_FIXED_SLICE_MODE, &(devCaps.supportsFixedSliceMode));
- amfCaps->GetProperty(AMF_VIDEO_ENCODER_CAP_NUM_OF_HW_INSTANCES, &(devCaps.maxNumOfHwInstances));
-
- res = GetIOCapability(false, amfCaps, &(devCaps.input));
- if (res != AMF_OK)
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Unable to query input capabilities for %s adapter '%s' with codec '%s', error %ls (code %d).",
- api->GetName().c_str(),
- adapter.Name.c_str(),
- Plugin::Utility::VCEEncoderTypeAsString(type),
- amfInstance->GetTrace()->GetResultText(res), res);
-
- res = GetIOCapability(true, amfCaps, &(devCaps.output));
- if (res != AMF_OK)
- AMF_LOG_ERROR("<" __FUNCTION_NAME__ "> Unable to query output capabilities for %s adapter '%s' with codec '%s', error %ls (code %d).",
- api->GetName().c_str(),
- adapter.Name.c_str(),
- Plugin::Utility::VCEEncoderTypeAsString(type),
- amfInstance->GetTrace()->GetResultText(res), res);
+#pragma endregion Singleton
+
+Plugin::AMD::CapabilityManager::CapabilityManager() {
+ // Key order: API, Adapter, Codec
+ for (auto api : API::EnumerateAPIs()) {
+ for (auto adapter : api->EnumerateAdapters()) {
+ for (auto codec : { Codec::AVC/*, Codec::SVC*/, Codec::HEVC }) {
+ bool isSupported = false;
+ try {
+ std::unique_ptr<AMD::Encoder> enc;
+
+ #ifdef WITH_AVC
+ if (codec == Codec::AVC || codec == Codec::SVC) {
+ enc = std::make_unique<AMD::EncoderH264>(api, adapter);
+ }
+ #endif
+ #ifdef WITH_HEVC
+ if (codec == Codec::HEVC) {
+ enc = std::make_unique<AMD::EncoderH265>(api, adapter);
+ }
+ #endif
+ if (enc != nullptr)
+ isSupported = true;
+ } catch (const std::exception& e) {
+ PLOG_WARNING("%s", e.what());
}
- amfComponent->Terminate();
+ PLOG_DEBUG("[Capability Manager] Testing %s Adapter '%s' with codec %s: %s.",
+ api->GetName().c_str(), adapter.Name.c_str(), Utility::CodecToString(codec),
+ isSupported ? "Supported" : "Not Supported");
- // Insert
- capabilityMap.insert(std::make_pair(
- std::make_tuple(api->GetName(), adapter, type),
- devCaps)
- );
+ std::tuple<API::Type, API::Adapter, AMD::Codec> key = std::make_tuple(api->GetType(), adapter, codec);
+ m_CapabilityMap[key] = isSupported;
}
- api->DestroyInstance(apiInst);
- amfContext->Terminate();
}
}
- return true;
}
-std::vector<std::pair<H264EncoderType, VCEDeviceCapabilities>> Plugin::AMD::VCECapabilities::GetAllAdapterCapabilities(std::shared_ptr<Plugin::API::Base> api, Plugin::API::Adapter adapter) {
- std::vector<std::pair<H264EncoderType, VCEDeviceCapabilities>> caps;
- for (auto kv : capabilityMap) {
- auto apiName = std::get<0>(kv.first);
- auto curAdapter = std::get<1>(kv.first);
+Plugin::AMD::CapabilityManager::~CapabilityManager() {}
- if (apiName != api->GetName())
- continue;
- if (curAdapter != adapter)
- continue;
-
- auto type = std::get<2>(kv.first);
- caps.push_back(std::make_pair(type, kv.second));
+bool Plugin::AMD::CapabilityManager::IsCodecSupported(AMD::Codec codec) {
+ for (auto api : API::EnumerateAPIs()) {
+ if (IsCodecSupportedByAPI(codec, api->GetType()))
+ return true;
}
- return caps;
+ return false;
}
-Plugin::AMD::VCEDeviceCapabilities Plugin::AMD::VCECapabilities::GetAdapterCapabilities(std::shared_ptr<Plugin::API::Base> api, Plugin::API::Adapter adapter, H264EncoderType type) {
- auto key = std::make_tuple(api->GetName(), adapter, type);
-
- if (capabilityMap.count(key) == 0)
- return VCEDeviceCapabilities();
-
- return capabilityMap.find(key)->second;
+bool Plugin::AMD::CapabilityManager::IsCodecSupportedByAPI(AMD::Codec codec, API::Type type) {
+ auto api = API::GetAPI(type);
+ for (auto adapter : api->EnumerateAdapters()) {
+ if (IsCodecSupportedByAPIAdapter(codec, type, adapter) == true)
+ return true;
+ }
+ return false;
}
+bool Plugin::AMD::CapabilityManager::IsCodecSupportedByAPIAdapter(AMD::Codec codec, API::Type api, API::Adapter adapter) {
+ return m_CapabilityMap[std::make_tuple(api, adapter, codec)];
+}
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/amf-encoder-h264.cpp
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#include "amf-encoder-h264.h"
+#include "utility.h"
+#include <inttypes.h>
+
+#define QUICK_THROW_ERROR(format, ...) {\
+ QUICK_FORMAT_MESSAGE(errMsg, __FUNCTION_NAME__ format, \
+ m_UniqueId, ##__VA_ARGS__, \
+ m_AMF->GetTrace()->GetResultText(res), res); \
+ throw std::exception(errMsg.c_str()); \
+}
+
+#define PREFIX "[H264]<Id: %lld> "
+
+using namespace Plugin;
+using namespace Plugin::AMD;
+using namespace Utility;
+
+Plugin::AMD::EncoderH264::EncoderH264(
+ std::shared_ptr<API::IAPI> videoAPI, API::Adapter videoAdapter,
+ bool useOpenCLSubmission, bool useOpenCLConversion,
+ ColorFormat colorFormat, ColorSpace colorSpace, bool fullRangeColor,
+ bool useAsyncQueue, size_t asyncQueueSize)
+ : Encoder(Codec::AVC,
+ videoAPI, videoAdapter,
+ useOpenCLSubmission, useOpenCLConversion,
+ colorFormat, colorSpace, fullRangeColor,
+ useAsyncQueue, asyncQueueSize) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = AMF_UNEXPECTED;
+
+ /// Full Range Color Stuff
+ static const wchar_t* fullColorParams[] = {
+ L"FullRangeColor",
+ L"NominalRange",
+ };
+ for (const wchar_t* par : fullColorParams) {
+ res = m_AMFConverter->SetProperty(par, m_FullColorRange);
+ res = m_AMFEncoder->SetProperty(par, m_FullColorRange);
+ if (res == AMF_OK)
+ break;
+ }
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ PREFIX "Failed to set encoder color range, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::EncoderH264::~EncoderH264() {
+ AMFTRACECALL;
+}
+
+// Properties - Initialization
+std::vector<Usage> Plugin::AMD::EncoderH264::CapsUsage() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_USAGE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<Usage> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::UsageFromAMFH264((AMF_VIDEO_ENCODER_USAGE_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH264::SetUsage(Usage v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_USAGE, Utility::UsageToAMFH264(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::UsageToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::Usage Plugin::AMD::EncoderH264::GetUsage() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_USAGE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::UsageFromAMFH264((AMF_VIDEO_ENCODER_USAGE_ENUM)e);
+}
+
+// Properties - Static
+std::vector<QualityPreset> Plugin::AMD::EncoderH264::CapsQualityPreset() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_QUALITY_PRESET, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<QualityPreset> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::QualityPresetFromAMFH264((AMF_VIDEO_ENCODER_QUALITY_PRESET_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH264::SetQualityPreset(QualityPreset v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_QUALITY_PRESET,
+ Utility::QualityPresetToAMFH264(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::QualityPresetToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::QualityPreset Plugin::AMD::EncoderH264::GetQualityPreset() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_QUALITY_PRESET, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::QualityPresetFromAMFH264((AMF_VIDEO_ENCODER_QUALITY_PRESET_ENUM)e);
+}
+
+std::vector<Profile> Plugin::AMD::EncoderH264::CapsProfile() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_PROFILE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<Profile> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::ProfileFromAMFH264((AMF_VIDEO_ENCODER_PROFILE_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH264::SetProfile(Profile v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_PROFILE,
+ Utility::ProfileToAMFH264(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::ProfileToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::Profile Plugin::AMD::EncoderH264::GetProfile() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_PROFILE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::ProfileFromAMFH264((AMF_VIDEO_ENCODER_PROFILE_ENUM)e);
+}
+
+std::vector<ProfileLevel> Plugin::AMD::EncoderH264::CapsProfileLevel() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_PROFILE_LEVEL, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<ProfileLevel> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back((ProfileLevel)enm->value);
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH264::SetProfileLevel(ProfileLevel v) {
+ AMFTRACECALL;
+
+ if (v == ProfileLevel::Automatic)
+ v = Utility::H264ProfileLevel(m_Resolution, m_FrameRate);
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_PROFILE_LEVEL,
+ (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, (int64_t)v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::ProfileLevel Plugin::AMD::EncoderH264::GetProfileLevel() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_PROFILE_LEVEL, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (ProfileLevel)e;
+}
+
+std::pair<uint64_t, uint64_t> Plugin::AMD::EncoderH264::CapsMaximumReferenceFrames() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_MAX_NUM_REFRAMES, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(var->minValue.int64Value, var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetMaximumReferenceFrames(uint64_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MAX_NUM_REFRAMES, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint64_t Plugin::AMD::EncoderH264::GetMaximumReferenceFrames() {
+ AMFTRACECALL;
+
+ uint64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MAX_NUM_REFRAMES, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+std::pair<std::pair<uint32_t, uint32_t>, std::pair<uint32_t, uint32_t>> Plugin::AMD::EncoderH264::CapsResolution() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_FRAMESIZE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(
+ std::make_pair(var->minValue.sizeValue.width, var->maxValue.sizeValue.width),
+ std::make_pair(var->minValue.sizeValue.height, var->maxValue.sizeValue.height)
+ );
+}
+
+void Plugin::AMD::EncoderH264::SetResolution(std::pair<uint32_t, uint32_t> v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_FRAMESIZE, ::AMFConstructSize(v.first, v.second));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ldx%ld, error %ls (code %d)",
+ m_UniqueId, v.first, v.second, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_Resolution.first = v.first;
+ m_Resolution.second = v.second;
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH264::GetResolution() {
+ AMFTRACECALL;
+
+ AMFSize e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_FRAMESIZE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_Resolution.first = e.width;
+ m_Resolution.second = e.height;
+ return std::make_pair(e.width, e.height);
+}
+
+void Plugin::AMD::EncoderH264::SetAspectRatio(std::pair<uint32_t, uint32_t> v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_ASPECT_RATIO, ::AMFConstructRatio(v.first, v.second));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld:%ld, error %ls (code %d)",
+ m_UniqueId, v.first, v.second, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH264::GetAspectRatio() {
+ AMFTRACECALL;
+
+ AMFRatio e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_ASPECT_RATIO, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return std::make_pair(e.num, e.den);
+}
+
+void Plugin::AMD::EncoderH264::SetFrameRate(std::pair<uint32_t, uint32_t> v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_FRAMERATE, ::AMFConstructRate(v.first, v.second));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld/%ld, error %ls (code %d)",
+ m_UniqueId, v.first, v.second, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_FrameRate = std::make_pair(v.first, v.second);
+ UpdateFrameRateValues();
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH264::GetFrameRate() {
+ AMFTRACECALL;
+
+ AMFRate e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_FRAMERATE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "Unable to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_FrameRate = std::make_pair(e.num, e.den);
+ UpdateFrameRateValues();
+ return m_FrameRate;
+}
+
+std::vector<CodingType> Plugin::AMD::EncoderH264::CapsCodingType() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_CABAC_ENABLE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<CodingType> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::CodingTypeFromAMFH264((AMF_VIDEO_ENCODER_CODING_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH264::SetCodingType(CodingType v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_CABAC_ENABLE, Utility::CodingTypeToAMFH264(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::CodingTypeToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::CodingType Plugin::AMD::EncoderH264::GetCodingType() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_CABAC_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::CodingTypeFromAMFH264((AMF_VIDEO_ENCODER_CODING_ENUM)e);
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH264::CapsMaximumLongTermReferenceFrames() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_MAX_LTR_FRAMES, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetMaximumLongTermReferenceFrames(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MAX_LTR_FRAMES, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetMaximumLongTermReferenceFrames() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MAX_LTR_FRAMES, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+// Properties - Dynamic
+std::vector<RateControlMethod> Plugin::AMD::EncoderH264::CapsRateControlMethod() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<RateControlMethod> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::RateControlMethodFromAMFH264((AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH264::SetRateControlMethod(RateControlMethod v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD, Utility::RateControlMethodToAMFH264(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::RateControlMethodToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::RateControlMethod Plugin::AMD::EncoderH264::GetRateControlMethod() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::RateControlMethodFromAMFH264((AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_ENUM)e);
+}
+
+std::vector<PrePassMode> Plugin::AMD::EncoderH264::CapsPrePassMode() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_RATE_CONTROL_PREANALYSIS_ENABLE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<PrePassMode> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::PrePassModeFromAMFH264((AMF_VIDEO_ENCODER_PREENCODE_MODE_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH264::SetPrePassMode(PrePassMode v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_PREANALYSIS_ENABLE, Utility::PrePassModeToAMFH264(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::PrePassModeToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::PrePassMode Plugin::AMD::EncoderH264::GetPrePassMode() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_PREANALYSIS_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::PrePassModeFromAMFH264((AMF_VIDEO_ENCODER_PREENCODE_MODE_ENUM)e);
+}
+
+void Plugin::AMD::EncoderH264::SetVarianceBasedAdaptiveQuantizationEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_ENABLE_VBAQ, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::IsVarianceBasedAdaptiveQuantizationEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_ENABLE_VBAQ, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetFrameSkippingEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_SKIP_FRAME_ENABLE, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::IsFrameSkippingEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_RATE_CONTROL_SKIP_FRAME_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetEnforceHRDEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_ENFORCE_HRD, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::IsEnforceHRDEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_ENFORCE_HRD, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetFillerDataEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_FILLER_DATA_ENABLE, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::IsFillerDataEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_FILLER_DATA_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetQPMinimum(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MIN_QP, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH264::GetQPMinimum() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MIN_QP, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetQPMaximum(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MAX_QP, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH264::GetQPMaximum() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MAX_QP, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+std::pair<uint64_t, uint64_t> Plugin::AMD::EncoderH264::CapsTargetBitrate() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_TARGET_BITRATE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(var->minValue.int64Value, var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetTargetBitrate(uint64_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_TARGET_BITRATE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint64_t Plugin::AMD::EncoderH264::GetTargetBitrate() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_TARGET_BITRATE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+std::pair<uint64_t, uint64_t> Plugin::AMD::EncoderH264::CapsPeakBitrate() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_PEAK_BITRATE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(var->minValue.int64Value, var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetPeakBitrate(uint64_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_PEAK_BITRATE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint64_t Plugin::AMD::EncoderH264::GetPeakBitrate() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_PEAK_BITRATE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetIFrameQP(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_QP_I, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH264::GetIFrameQP() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_QP_I, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetPFrameQP(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_QP_P, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH264::GetPFrameQP() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_QP_P, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetBFrameQP(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_QP_B, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH264::GetBFrameQP() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_QP_B, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetMaximumAccessUnitSize(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MAX_AU_SIZE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetMaximumAccessUnitSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MAX_AU_SIZE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+std::pair<uint64_t, uint64_t> Plugin::AMD::EncoderH264::CapsVBVBufferSize() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_VBV_BUFFER_SIZE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(var->minValue.int64Value, var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetVBVBufferSize(uint64_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_VBV_BUFFER_SIZE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint64_t Plugin::AMD::EncoderH264::GetVBVBufferSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_VBV_BUFFER_SIZE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetVBVBufferInitialFullness(double v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_INITIAL_VBV_BUFFER_FULLNESS, (int64_t)(v * 64));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %lf (%d), error %ls (code %d)",
+ m_UniqueId, v, (uint8_t)(v * 64), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+float Plugin::AMD::EncoderH264::GetInitialVBVBufferFullness() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_INITIAL_VBV_BUFFER_FULLNESS, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (e / 64.0f);
+}
+
+// Properties - Picture Control
+void Plugin::AMD::EncoderH264::SetIDRPeriod(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_IDR_PERIOD, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_PeriodIDR = v;
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetIDRPeriod() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_IDR_PERIOD, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_PeriodIDR = (uint32_t)e;
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetHeaderInsertionSpacing(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEADER_INSERTION_SPACING, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetHeaderInsertionSpacing() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEADER_INSERTION_SPACING, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetGOPAlignmentEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"EnableGOPAlignment", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::IsGOPAlignmentEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"EnableGOPAlignment", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetDeblockingFilterEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_DE_BLOCKING_FILTER, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::IsDeblockingFilterEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_DE_BLOCKING_FILTER, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+uint8_t Plugin::AMD::EncoderH264::CapsBFramePattern() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_B_PIC_PATTERN, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return (uint8_t)var->maxValue.int64Value;
+}
+
+void Plugin::AMD::EncoderH264::SetBFramePattern(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_B_PIC_PATTERN, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_TimestampOffset = v;
+}
+
+uint8_t Plugin::AMD::EncoderH264::GetBFramePattern() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_B_PIC_PATTERN, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetBFrameDeltaQP(int8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_B_PIC_DELTA_QP, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+int8_t Plugin::AMD::EncoderH264::GetBFrameDeltaQP() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_B_PIC_DELTA_QP, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (int8_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetBFrameReferenceEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_B_REFERENCE_ENABLE, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::IsBFrameReferenceEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_B_REFERENCE_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetBFrameReferenceDeltaQP(int8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_REF_B_PIC_DELTA_QP, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+int8_t Plugin::AMD::EncoderH264::GetBFrameReferenceDeltaQP() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_REF_B_PIC_DELTA_QP, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (int8_t)e;
+}
+
+// Properties - Motion Estimation
+void Plugin::AMD::EncoderH264::SetMotionEstimationQuarterPixelEnabled(bool v) {
+	AMFTRACECALL;
+
+	AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MOTION_QUARTERPIXEL, v);
+	if (res != AMF_OK) {
+		QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+			m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+		throw std::exception(errMsg.c_str());
+	}
+}
+
+bool Plugin::AMD::EncoderH264::IsMotionEstimationQuarterPixelEnabled() {
+	AMFTRACECALL;
+
+	bool e;
+
+	AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MOTION_QUARTERPIXEL, &e);
+	if (res != AMF_OK) {
+		QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+			m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+		throw std::exception(errMsg.c_str());
+	}
+	return e;
+}
+
+void Plugin::AMD::EncoderH264::SetMotionEstimationHalfPixelEnabled(bool v) {
+	AMFTRACECALL;
+
+	AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_MOTION_HALF_PIXEL, v);
+	if (res != AMF_OK) {
+		QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+			m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+		throw std::exception(errMsg.c_str());
+	}
+}
+
+bool Plugin::AMD::EncoderH264::IsMotionEstimationHalfPixelEnabled() {
+	AMFTRACECALL;
+
+	bool e;
+
+	AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_MOTION_HALF_PIXEL, &e);
+	if (res != AMF_OK) {
+		QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+			m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+		throw std::exception(errMsg.c_str());
+	}
+	return e;
+}
+
+// Properties - Intra-Refresh
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH264::CapsIntraRefreshNumMBsPerSlot() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_INTRA_REFRESH_NUM_MBS_PER_SLOT, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetIntraRefreshNumMBsPerSlot(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_INTRA_REFRESH_NUM_MBS_PER_SLOT, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetIntraRefreshNumMBsPerSlot() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_INTRA_REFRESH_NUM_MBS_PER_SLOT, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetIntraRefreshNumOfStripes(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"IntraRefreshNumOfStripes", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetIntraRefreshNumOfStripes() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"IntraRefreshNumOfStripes", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+// Properties - Slicing
+void Plugin::AMD::EncoderH264::SetSliceMode(H264::SliceMode v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"SliceMode", static_cast<int64_t>(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::H264::SliceMode Plugin::AMD::EncoderH264::GetSliceMode() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"SliceMode", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return static_cast<H264::SliceMode>(e);
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH264::CapsSlicesPerFrame() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_SLICES_PER_FRAME, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetSlicesPerFrame(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_SLICES_PER_FRAME, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetSlicesPerFrame() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_SLICES_PER_FRAME, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH264::SetSliceControlMode(SliceControlMode v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"SliceControlMode", static_cast<int64_t>(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::SliceControlMode Plugin::AMD::EncoderH264::GetSliceControlMode() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"SliceControlMode", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return static_cast<SliceControlMode>(e);
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH264::CapsSliceControlSize() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(L"SliceControlSize", &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetSliceControlSize(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"SliceControlSize", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetSliceControlSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"SliceControlSize", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH264::CapsMaximumSliceSize() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(L"MaxSliceSize", &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH264::SetMaximumSliceSize(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"MaxSliceSize", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH264::GetMaximumSliceSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"MaxSliceSize", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+// Properties - Experimental
+void Plugin::AMD::EncoderH264::SetLowLatencyInternal(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"LowLatencyInternal", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::GetLowLatencyInternal() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"LowLatencyInternal", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH264::SetCommonLowLatencyInternal(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"CommonLowLatencyInternal", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH264::GetCommonLowLatencyInternal() {
+ AMFTRACECALL;
+
+ bool e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"CommonLowLatencyInternal", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, PREFIX "<" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+// Internal
+void Plugin::AMD::EncoderH264::PacketPriorityAndKeyframe(amf::AMFDataPtr& pData, struct encoder_packet* packet) {
+ AMFTRACECALL;
+ uint64_t pktType;
+ pData->GetProperty(AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE, &pktType);
+ switch ((AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_ENUM)pktType) {
+ case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_IDR:
+ packet->keyframe = true;
+ /* fallthrough - IDR frames also receive I-frame priority */
+ case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_I:
+ packet->priority = 3;
+ break;
+ case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_P:
+ packet->priority = 2;
+ break;
+ case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_B:
+ packet->priority = 0;
+ break;
+ }
+}
+
+AMF_RESULT Plugin::AMD::EncoderH264::GetExtraDataInternal(amf::AMFVariant* p) {
+ AMFTRACECALL;
+ return m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_EXTRADATA, p);
+}
+
+std::string Plugin::AMD::EncoderH264::HandleTypeOverride(amf::AMFSurfacePtr & d, uint64_t index) {
+ AMF_VIDEO_ENCODER_PICTURE_TYPE_ENUM type = AMF_VIDEO_ENCODER_PICTURE_TYPE_NONE;
+
+ if ((m_PeriodBFrame > 0) && ((index % m_PeriodBFrame) == 0)) {
+ type = AMF_VIDEO_ENCODER_PICTURE_TYPE_B;
+ }
+ if ((m_PeriodPFrame > 0) && ((index % m_PeriodPFrame) == 0)) {
+ type = AMF_VIDEO_ENCODER_PICTURE_TYPE_P;
+ }
+ if ((m_PeriodIFrame > 0) && ((index % m_PeriodIFrame) == 0)) {
+ type = AMF_VIDEO_ENCODER_PICTURE_TYPE_I;
+ }
+ if ((type != AMF_VIDEO_ENCODER_PICTURE_TYPE_NONE) && (m_PeriodIDR > 0) && ((index % m_PeriodIDR) == 0)) {
+ type = AMF_VIDEO_ENCODER_PICTURE_TYPE_IDR;
+ }
+ if (m_FrameSkipPeriod > 0) {
+ bool shouldSkip = m_FrameSkipKeepOnlyNth
+ ? (index % m_FrameSkipPeriod) != 0
+ : (index % m_FrameSkipPeriod) == 0;
+
+ if (shouldSkip) {
+ if ((m_FrameSkipType <= AMF_VIDEO_ENCODER_PICTURE_TYPE_SKIP) || (type < m_FrameSkipType))
+ m_FrameSkipType = type;
+ type = AMF_VIDEO_ENCODER_PICTURE_TYPE_SKIP;
+ } else if (m_FrameSkipType != AMF_VIDEO_ENCODER_PICTURE_TYPE_NONE) {
+ type = m_FrameSkipType; // Re-apply the frame type deferred while skipping so a forced keyframe is not lost.
+ m_FrameSkipType = AMF_VIDEO_ENCODER_PICTURE_TYPE_NONE;
+ }
+ }
+ d->SetProperty(AMF_VIDEO_ENCODER_FORCE_PICTURE_TYPE, type);
+
+ switch (type) {
+ case AMF_VIDEO_ENCODER_PICTURE_TYPE_NONE:
+ return "Automatic";
+ case AMF_VIDEO_ENCODER_PICTURE_TYPE_SKIP:
+ return "Skip";
+ case AMF_VIDEO_ENCODER_PICTURE_TYPE_IDR:
+ return "IDR";
+ case AMF_VIDEO_ENCODER_PICTURE_TYPE_I:
+ return "I";
+ case AMF_VIDEO_ENCODER_PICTURE_TYPE_P:
+ return "P";
+ case AMF_VIDEO_ENCODER_PICTURE_TYPE_B:
+ return "B";
+ }
+ return "Unknown";
+}
+
+void Plugin::AMD::EncoderH264::LogProperties() {
+ AMFTRACECALL;
+
+ PLOG_INFO(PREFIX "Encoder Parameters:",
+ m_UniqueId);
+ #pragma region Backend
+ PLOG_INFO(PREFIX " Backend:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Video API: %s",
+ m_UniqueId,
+ m_API->GetName().c_str());
+ PLOG_INFO(PREFIX " Video Adapter: %s",
+ m_UniqueId,
+ m_APIAdapter.Name.c_str());
+ PLOG_INFO(PREFIX " OpenCL: %s",
+ m_UniqueId,
+ m_OpenCL ? "Supported" : "Not Supported");
+ PLOG_INFO(PREFIX " Transfer: %s",
+ m_UniqueId,
+ m_OpenCLSubmission ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Conversion: %s",
+ m_UniqueId,
+ m_OpenCLConversion ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Async Queue: %s",
+ m_UniqueId,
+ m_AsyncQueue ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Size: %" PRIu32,
+ m_UniqueId,
+ (uint32_t)m_AsyncQueueSize);
+ #pragma endregion Backend
+ #pragma region Frame
+ PLOG_INFO(PREFIX " Frame:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Format: %s %s %s",
+ m_UniqueId,
+ Utility::ColorFormatToString(m_ColorFormat),
+ Utility::ColorSpaceToString(m_ColorSpace),
+ m_FullColorRange ? "Full" : "Partial");
+ PLOG_INFO(PREFIX " Resolution: %" PRIu32 "x%" PRIu32,
+ m_UniqueId,
+ m_Resolution.first,
+ m_Resolution.second);
+ PLOG_INFO(PREFIX " Frame Rate: %" PRIu32 "/%" PRIu32,
+ m_UniqueId,
+ m_FrameRate.first,
+ m_FrameRate.second);
+ auto aspectRatio = GetAspectRatio();
+ PLOG_INFO(PREFIX " Aspect Ratio: %" PRIu32 ":%" PRIu32,
+ m_UniqueId,
+ aspectRatio.first,
+ aspectRatio.second);
+ #pragma endregion Frame
+ #pragma region Static
+ PLOG_INFO(PREFIX " Static:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Usage: %s",
+ m_UniqueId,
+ Utility::UsageToString(GetUsage()));
+ PLOG_INFO(PREFIX " Quality Preset: %s",
+ m_UniqueId,
+ Utility::QualityPresetToString(GetQualityPreset()));
+ auto profileLevel = static_cast<uint16_t>(GetProfileLevel());
+ PLOG_INFO(PREFIX " Profile: %s %" PRIu16 ".%" PRIu16,
+ m_UniqueId,
+ Utility::ProfileToString(GetProfile()),
+ profileLevel / 10,
+ profileLevel % 10);
+ PLOG_INFO(PREFIX " Coding Type: %s",
+ m_UniqueId,
+ Utility::CodingTypeToString(GetCodingType()));
+ PLOG_INFO(PREFIX " Max. Reference Frames: %" PRIu16,
+ m_UniqueId,
+ (uint16_t)GetMaximumReferenceFrames());
+ PLOG_INFO(PREFIX " Max. Long-Term Reference Frames: %" PRIu16,
+ m_UniqueId,
+ (uint16_t)GetMaximumLongTermReferenceFrames());
+ #pragma endregion Static
+ #pragma region Rate Control
+ PLOG_INFO(PREFIX " Rate Control:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Method: %s",
+ m_UniqueId,
+ Utility::RateControlMethodToString(GetRateControlMethod()));
+ PLOG_INFO(PREFIX " Pre-Pass Mode: %s",
+ m_UniqueId,
+ Utility::PrePassModeToString(GetPrePassMode()));
+ #pragma region QP
+ PLOG_INFO(PREFIX " QP:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Range: %" PRIu8 " - %" PRIu8,
+ m_UniqueId,
+ GetQPMinimum(),
+ GetQPMaximum());
+ PLOG_INFO(PREFIX " I-Frame: %" PRIu8,
+ m_UniqueId,
+ GetIFrameQP());
+ PLOG_INFO(PREFIX " P-Frame: %" PRIu8,
+ m_UniqueId,
+ GetPFrameQP());
+ try {
+ PLOG_INFO(PREFIX " B-Frame: %" PRIu8,
+ m_UniqueId,
+ GetBFrameQP());
+ } catch (...) {
+ PLOG_INFO(PREFIX " B-Frame: N/A",
+ m_UniqueId);
+ }
+ #pragma endregion QP
+ #pragma region Bitrate
+ PLOG_INFO(PREFIX " Bitrate:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Target: %" PRIu64 " bit/s",
+ m_UniqueId,
+ GetTargetBitrate());
+ PLOG_INFO(PREFIX " Peak: %" PRIu64 " bit/s",
+ m_UniqueId,
+ GetPeakBitrate());
+ #pragma endregion Bitrate
+ #pragma region Flags
+ PLOG_INFO(PREFIX " Flags:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Filler Data: %s",
+ m_UniqueId,
+ IsFillerDataEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Frame Skipping: %s",
+ m_UniqueId,
+ IsFrameSkippingEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Period: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetFrameSkippingPeriod());
+ PLOG_INFO(PREFIX " Behaviour: %s",
+ m_UniqueId,
+ GetFrameSkippingBehaviour() ? "Keep every Nth frame" : "Skip every Nth frame");
+ PLOG_INFO(PREFIX " Variance Based Adaptive Quantization: %s",
+ m_UniqueId,
+ IsVarianceBasedAdaptiveQuantizationEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Enforce Hypothetical Reference Decoder: %s",
+ m_UniqueId,
+ IsEnforceHRDEnabled() ? "Enabled" : "Disabled");
+ #pragma endregion Flags
+ #pragma region Video Buffering Verifier
+ PLOG_INFO(PREFIX "  Video Buffering Verifier:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Buffer Size: %" PRIu64 " bits",
+ m_UniqueId,
+ GetVBVBufferSize());
+ PLOG_INFO(PREFIX " Initial Fullness: %" PRIu64 " %%",
+ m_UniqueId,
+ (uint64_t)round(GetInitialVBVBufferFullness() * 100.0));
+ #pragma endregion Video Buffering Verifier
+ PLOG_INFO(PREFIX " Max. Access Unit Size: %" PRIu32,
+ m_UniqueId,
+ GetMaximumAccessUnitSize());
+ #pragma endregion Rate Control
+
+ #pragma region Picture Control
+ PLOG_INFO(PREFIX " Picture Control:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Period:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " IDR: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetIDRPeriod());
+ PLOG_INFO(PREFIX " I: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetIFramePeriod());
+ PLOG_INFO(PREFIX " P: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetPFramePeriod());
+ PLOG_INFO(PREFIX " B: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetBFramePeriod());
+ PLOG_INFO(PREFIX " Header Insertion Spacing: %" PRIu32,
+ m_UniqueId,
+ GetHeaderInsertionSpacing());
+ PLOG_INFO(PREFIX " GOP Alignment: %s",
+ m_UniqueId,
+ IsGOPAlignmentEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Deblocking Filter: %s",
+ m_UniqueId,
+ IsDeblockingFilterEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Motion Estimation: %s%s",
+ m_UniqueId,
+ IsMotionEstimationQuarterPixelEnabled() ? (IsMotionEstimationHalfPixelEnabled() ? "Quarter, " : "Quarter") : "",
+ IsMotionEstimationHalfPixelEnabled() ? "Half" : "");
+ PLOG_INFO(PREFIX " B-Frames:",
+ m_UniqueId);
+ try {
+ PLOG_INFO(PREFIX " Pattern: %" PRIu8,
+ m_UniqueId,
+ GetBFramePattern());
+ } catch (...) {
+ PLOG_INFO(PREFIX " Pattern: N/A",
+ m_UniqueId);
+ }
+ try {
+ PLOG_INFO(PREFIX " Delta QP: %" PRIi8,
+ m_UniqueId,
+ GetBFrameDeltaQP());
+ } catch (...) {
+ PLOG_INFO(PREFIX " Delta QP: N/A",
+ m_UniqueId);
+ }
+ try {
+ PLOG_INFO(PREFIX " Reference: %s",
+ m_UniqueId,
+ IsBFrameReferenceEnabled() ? "Enabled" : "Disabled");
+ } catch (...) {
+ PLOG_INFO(PREFIX " Reference: N/A",
+ m_UniqueId);
+ }
+ try {
+ PLOG_INFO(PREFIX " Reference Delta QP: %" PRIi8,
+ m_UniqueId,
+ GetBFrameReferenceDeltaQP());
+ } catch (...) {
+ PLOG_INFO(PREFIX " Reference Delta QP: N/A",
+ m_UniqueId);
+ }
+ #pragma endregion Picture Control
+
+ #pragma region Intra-Refresh
+ PLOG_INFO(PREFIX " Intra-Refresh:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Number of Macroblocks Per Slot: %" PRIu32,
+ m_UniqueId,
+ GetIntraRefreshNumMBsPerSlot());
+ PLOG_INFO(PREFIX " Number of Stripes: %" PRIu32,
+ m_UniqueId,
+ GetIntraRefreshNumOfStripes());
+ #pragma endregion Intra-Refresh
+
+ #pragma region Slicing
+ PLOG_INFO(PREFIX " Slicing:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX "   Mode: %s",
+ m_UniqueId,
+ Utility::SliceModeToString(GetSliceMode()));
+ PLOG_INFO(PREFIX " Slices Per Frame: %" PRIu32,
+ m_UniqueId,
+ GetSlicesPerFrame());
+ PLOG_INFO(PREFIX " Control Mode: %s",
+ m_UniqueId,
+ Utility::SliceControlModeToString(GetSliceControlMode()));
+ PLOG_INFO(PREFIX " Control Size: %" PRIu32,
+ m_UniqueId,
+ GetSliceControlSize());
+ PLOG_INFO(PREFIX " Maximum Slice Size: %" PRIu32,
+ m_UniqueId,
+ GetMaximumSliceSize());
+ #pragma endregion Slicing
+
+ #pragma region Experimental
+ PLOG_INFO(PREFIX " Experimental:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Low Latency: %s",
+ m_UniqueId,
+ GetLowLatencyInternal() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Ultra Low Latency: %s",
+ m_UniqueId,
+ GetCommonLowLatencyInternal() ? "Enabled" : "Disabled");
+ #pragma endregion Experimental
+
+}
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/amf-encoder-h265.cpp
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#include "amf-encoder-h265.h"
+#include "utility.h"
+#include <inttypes.h>
+
+#define QUICK_THROW_ERROR(format, ...) {\
+ QUICK_FORMAT_MESSAGE(errMsg, __FUNCTION_NAME__ format, \
+ m_UniqueId, ##__VA_ARGS__, \
+ m_AMF->GetTrace()->GetResultText(res), res); \
+ throw std::exception(errMsg.c_str()); \
+}
+#define PREFIX "[H265]<Id: %lld> "
+
+using namespace Plugin;
+using namespace Plugin::AMD;
+using namespace Utility;
+
+Plugin::AMD::EncoderH265::EncoderH265(
+ std::shared_ptr<API::IAPI> videoAPI, API::Adapter videoAdapter,
+ bool useOpenCLSubmission, bool useOpenCLConversion,
+ ColorFormat colorFormat, ColorSpace colorSpace, bool fullRangeColor,
+ bool useAsyncQueue, size_t asyncQueueSize)
+ : Encoder(Codec::HEVC,
+ videoAPI, videoAdapter,
+ useOpenCLSubmission, useOpenCLConversion,
+ colorFormat, colorSpace, fullRangeColor,
+ useAsyncQueue, asyncQueueSize) {
+ AMFTRACECALL;
+}
+
+
+Plugin::AMD::EncoderH265::~EncoderH265() {
+ AMFTRACECALL;
+}
+
+// Initialization
+std::vector<Usage> Plugin::AMD::EncoderH265::CapsUsage() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_USAGE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<Usage> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::UsageFromAMFH265((AMF_VIDEO_ENCODER_HEVC_USAGE_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH265::SetUsage(Usage v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_USAGE, Utility::UsageToAMFH265(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::UsageToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::Usage Plugin::AMD::EncoderH265::GetUsage() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_USAGE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::UsageFromAMFH265((AMF_VIDEO_ENCODER_HEVC_USAGE_ENUM)e);
+}
+
+// Static
+std::vector<QualityPreset> Plugin::AMD::EncoderH265::CapsQualityPreset() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<QualityPreset> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::QualityPresetFromAMFH265((AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH265::SetQualityPreset(QualityPreset v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET, Utility::QualityPresetToAMFH265(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::QualityPresetToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::QualityPreset Plugin::AMD::EncoderH265::GetQualityPreset() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::QualityPresetFromAMFH265((AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_ENUM)e);
+}
+
+std::pair<std::pair<uint32_t, uint32_t>, std::pair<uint32_t, uint32_t>> Plugin::AMD::EncoderH265::CapsResolution() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_FRAMESIZE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(
+ std::make_pair(var->minValue.sizeValue.width, var->maxValue.sizeValue.width),
+ std::make_pair(var->minValue.sizeValue.height, var->maxValue.sizeValue.height)
+ );
+}
+
+void Plugin::AMD::EncoderH265::SetResolution(std::pair<uint32_t, uint32_t> v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_FRAMESIZE, ::AMFConstructSize(v.first, v.second));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ldx%ld, error %ls (code %d)",
+ m_UniqueId, v.first, v.second, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_Resolution.first = v.first;
+ m_Resolution.second = v.second;
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH265::GetResolution() {
+ AMFTRACECALL;
+
+ AMFSize e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_FRAMESIZE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_Resolution.first = e.width;
+ m_Resolution.second = e.height;
+ return std::make_pair(e.width, e.height);
+}
+
+void Plugin::AMD::EncoderH265::SetAspectRatio(std::pair<uint32_t, uint32_t> v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_ASPECT_RATIO, ::AMFConstructRatio(v.first, v.second));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld:%ld, error %ls (code %d)",
+ m_UniqueId, v.first, v.second, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH265::GetAspectRatio() {
+ AMFTRACECALL;
+
+ AMFRatio e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_ASPECT_RATIO, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return std::make_pair(e.num, e.den);
+}
+
+void Plugin::AMD::EncoderH265::SetFrameRate(std::pair<uint32_t, uint32_t> v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_FRAMERATE, ::AMFConstructRate(v.first, v.second));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld/%ld, error %ls (code %d)",
+ m_UniqueId, v.first, v.second, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_FrameRate = std::make_pair(v.first, v.second);
+ UpdateFrameRateValues();
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH265::GetFrameRate() {
+ AMFTRACECALL;
+
+ AMFRate e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_FRAMERATE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_FrameRate = std::make_pair(e.num, e.den);
+ UpdateFrameRateValues();
+ return m_FrameRate;
+}
+
+std::vector<Profile> Plugin::AMD::EncoderH265::CapsProfile() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_PROFILE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<Profile> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::ProfileFromAMFH265((AMF_VIDEO_ENCODER_HEVC_PROFILE_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH265::SetProfile(Profile v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_PROFILE, Utility::ProfileToAMFH265(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::ProfileToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::Profile Plugin::AMD::EncoderH265::GetProfile() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_PROFILE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::ProfileFromAMFH265((AMF_VIDEO_ENCODER_HEVC_PROFILE_ENUM)e);
+}
+
+std::vector<ProfileLevel> Plugin::AMD::EncoderH265::CapsProfileLevel() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_PROFILE_LEVEL, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<ProfileLevel> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back((ProfileLevel)(enm->value / 3));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH265::SetProfileLevel(ProfileLevel v) {
+ AMFTRACECALL;
+
+ if (v == ProfileLevel::Automatic)
+ v = Utility::H265ProfileLevel(m_Resolution, m_FrameRate);
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_PROFILE_LEVEL, ((int64_t)v) * 3);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, (int64_t)v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::ProfileLevel Plugin::AMD::EncoderH265::GetProfileLevel() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_PROFILE_LEVEL, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (ProfileLevel)(e / 3);
+}
+
+std::vector<H265::Tier> Plugin::AMD::EncoderH265::CapsTier() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_TIER, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<H265::Tier> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::TierFromAMFH265((AMF_VIDEO_ENCODER_HEVC_TIER_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH265::SetTier(H265::Tier v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_TIER, Utility::TierToAMFH265(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::TierToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::H265::Tier Plugin::AMD::EncoderH265::GetTier() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_TIER, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (H265::Tier)e;
+}
+
+std::pair<uint64_t, uint64_t> Plugin::AMD::EncoderH265::CapsMaximumReferenceFrames() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_MAX_NUM_REFRAMES, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(var->minValue.int64Value, var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH265::SetMaximumReferenceFrames(uint64_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_NUM_REFRAMES, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint64_t Plugin::AMD::EncoderH265::GetMaximumReferenceFrames() {
+ AMFTRACECALL;
+
+ uint64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_NUM_REFRAMES, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+std::vector<CodingType> Plugin::AMD::EncoderH265::CapsCodingType() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_CABAC_ENABLE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<CodingType> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::CodingTypeFromAMFH265((AMF_VIDEO_ENCODER_CODING_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH265::SetCodingType(CodingType v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_CABAC_ENABLE, Utility::CodingTypeToAMFH265(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::CodingTypeToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::CodingType Plugin::AMD::EncoderH265::GetCodingType() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_CABAC_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> Unable to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::CodingTypeFromAMFH265((AMF_VIDEO_ENCODER_CODING_ENUM)e);
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH265::CapsMaximumLongTermReferenceFrames() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_MAX_LTR_FRAMES, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH265::SetMaximumLongTermReferenceFrames(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_LTR_FRAMES, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetMaximumLongTermReferenceFrames() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_LTR_FRAMES, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+/// Rate Control
+std::vector<RateControlMethod> Plugin::AMD::EncoderH265::CapsRateControlMethod() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<RateControlMethod> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::RateControlMethodFromAMFH265((AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_ENUM)enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH265::SetRateControlMethod(RateControlMethod v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD, Utility::RateControlMethodToAMFH265(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, Utility::RateControlMethodToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::RateControlMethod Plugin::AMD::EncoderH265::GetRateControlMethod() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::RateControlMethodFromAMFH265((AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_ENUM)e);
+}
+
+std::vector<PrePassMode> Plugin::AMD::EncoderH265::CapsPrePassMode() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_PREANALYSIS_ENABLE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ if (var->type == amf::AMF_VARIANT_BOOL) {
+ return std::vector<PrePassMode>({ PrePassMode::Disabled, PrePassMode::Enabled });
+ } else {
+ std::vector<PrePassMode> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ PLOG_ERROR("Unknown Pre-Pass Mode: %ls %lld", enm->name, enm->value);
+ //ret.push_back(Utility::PrePassModeFromAMFH265((AMF_VIDEO_ENCODER_PREENCODE_MODE_ENUM)enm->value));
+ }
+ return ret;
+ }
+}
+
+void Plugin::AMD::EncoderH265::SetPrePassMode(PrePassMode v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_PREANALYSIS_ENABLE, v != PrePassMode::Disabled);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, (v != PrePassMode::Disabled) ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::PrePassMode Plugin::AMD::EncoderH265::GetPrePassMode() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_PREANALYSIS_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> Unable to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ if (e) {
+ return PrePassMode::Enabled;
+ } else {
+ return PrePassMode::Disabled;
+ }
+}
+
+void Plugin::AMD::EncoderH265::SetVarianceBasedAdaptiveQuantizationEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_ENABLE_VBAQ, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::IsVarianceBasedAdaptiveQuantizationEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_ENABLE_VBAQ, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+/// VBV Buffer
+std::pair<uint64_t, uint64_t> Plugin::AMD::EncoderH265::CapsVBVBufferSize() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_VBV_BUFFER_SIZE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(var->minValue.int64Value, var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH265::SetVBVBufferSize(uint64_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_VBV_BUFFER_SIZE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint64_t Plugin::AMD::EncoderH265::GetVBVBufferSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_VBV_BUFFER_SIZE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH265::SetVBVBufferInitialFullness(double v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_INITIAL_VBV_BUFFER_FULLNESS, (int64_t)(v * 64));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %lf (%d), error %ls (code %d)",
+ m_UniqueId, v, (uint8_t)(v * 64), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+float Plugin::AMD::EncoderH265::GetInitialVBVBufferFullness() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_INITIAL_VBV_BUFFER_FULLNESS, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (e / 64.0f);
+}
+
+/// Picture Control
+std::vector<H265::GOPType> Plugin::AMD::EncoderH265::CapsGOPType() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(L"GOPType", &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ std::vector<H265::GOPType> ret;
+ for (const amf::AMFEnumDescriptionEntry* enm = var->pEnumDescription; enm->name != nullptr; enm++) {
+ ret.push_back(Utility::GOPTypeFromAMFH265(enm->value));
+ }
+ return ret;
+}
+
+void Plugin::AMD::EncoderH265::SetGOPType(H265::GOPType v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"GOPType", Utility::GOPTypeToAMFH265(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+ m_UniqueId, Utility::GOPTypeToString(v), m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::H265::GOPType Plugin::AMD::EncoderH265::GetGOPType() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"GOPType", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return Utility::GOPTypeFromAMFH265(e);
+}
+
+void Plugin::AMD::EncoderH265::SetGOPSize(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_GOP_SIZE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetGOPSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_GOP_SIZE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetGOPSizeMin(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"GOPSizeMin", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetGOPSizeMin() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"GOPSizeMin", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetGOPSizeMax(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"GOPSizeMax", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetGOPSizeMax() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"GOPSizeMax", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetGOPAlignmentEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"EnableGOPAlignment", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::IsGOPAlignmentEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"EnableGOPAlignment", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH265::SetIDRPeriod(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_NUM_GOPS_PER_IDR, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_PeriodIDR = v;
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetIDRPeriod() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_NUM_GOPS_PER_IDR, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ m_PeriodIDR = (uint32_t)e;
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetHeaderInsertionMode(H265::HeaderInsertionMode v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_HEADER_INSERTION_MODE, static_cast<AMF_VIDEO_ENCODER_HEVC_HEADER_INSERTION_MODE_ENUM>(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::H265::HeaderInsertionMode Plugin::AMD::EncoderH265::GetHeaderInsertionMode() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_HEADER_INSERTION_MODE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return static_cast<H265::HeaderInsertionMode>(e);
+}
+
+void Plugin::AMD::EncoderH265::SetDeblockingFilterEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_DE_BLOCKING_FILTER_DISABLE, !v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::IsDeblockingFilterEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_DE_BLOCKING_FILTER_DISABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ // The AMF property is a "disable" flag, so invert it to report "enabled".
+ return !e;
+
+/// Motion Estimation
+void Plugin::AMD::EncoderH265::SetMotionEstimationQuarterPixelEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MOTION_QUARTERPIXEL, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::IsMotionEstimationQuarterPixelEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MOTION_QUARTERPIXEL, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH265::SetMotionEstimationHalfPixelEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MOTION_HALF_PIXEL, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::IsMotionEstimationHalfPixelEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MOTION_HALF_PIXEL, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+// Dynamic
+void Plugin::AMD::EncoderH265::SetFrameSkippingEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_SKIP_FRAME_ENABLE, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::IsFrameSkippingEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_SKIP_FRAME_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH265::SetEnforceHRDEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_ENFORCE_HRD, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::IsEnforceHRDEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_ENFORCE_HRD, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH265::SetFillerDataEnabled(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_FILLER_DATA_ENABLE, v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::IsFillerDataEnabled() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_FILLER_DATA_ENABLE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH265::SetIFrameQPMinimum(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MIN_QP_I, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH265::GetIFrameQPMinimum() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MIN_QP_I, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetIFrameQPMaximum(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_QP_I, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH265::GetIFrameQPMaximum() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_QP_I, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetPFrameQPMinimum(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MIN_QP_P, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH265::GetPFrameQPMinimum() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MIN_QP_P, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetPFrameQPMaximum(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_QP_P, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH265::GetPFrameQPMaximum() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_QP_P, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+std::pair<uint64_t, uint64_t> Plugin::AMD::EncoderH265::CapsTargetBitrate() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_TARGET_BITRATE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(var->minValue.int64Value, var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH265::SetTargetBitrate(uint64_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_TARGET_BITRATE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint64_t Plugin::AMD::EncoderH265::GetTargetBitrate() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_TARGET_BITRATE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+std::pair<uint64_t, uint64_t> Plugin::AMD::EncoderH265::CapsPeakBitrate() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(AMF_VIDEO_ENCODER_HEVC_PEAK_BITRATE, &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair(var->minValue.int64Value, var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH265::SetPeakBitrate(uint64_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_PEAK_BITRATE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %lld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint64_t Plugin::AMD::EncoderH265::GetPeakBitrate() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_PEAK_BITRATE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH265::SetIFrameQP(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_QP_I, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH265::GetIFrameQP() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_QP_I, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetPFrameQP(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_QP_P, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %d, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH265::GetPFrameQP() {
+ AMFTRACECALL;
+
+ int64_t e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_QP_P, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetMaximumAccessUnitSize(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_AU_SIZE, (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetMaximumAccessUnitSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_MAX_AU_SIZE, &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+// Intra-Refresh
+void Plugin::AMD::EncoderH265::SetIntraRefreshMode(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"IntraRefreshMode", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetIntraRefreshMode() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"IntraRefreshMode", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetIntraRefreshFrameNum(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"HevcIntraRefreshFrameNum", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetIntraRefreshFrameNum() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"HevcIntraRefreshFrameNum", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+// Properties - Slicing
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH265::CapsSlicesPerFrame() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(L"HevcSlicesPerFrame", &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH265::SetSlicesPerFrame(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"HevcSlicesPerFrame", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetSlicesPerFrame() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"HevcSlicesPerFrame", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetSliceControlMode(SliceControlMode v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"SliceControlMode", static_cast<int64_t>(v));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+Plugin::AMD::SliceControlMode Plugin::AMD::EncoderH265::GetSliceControlMode() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"SliceControlMode", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return static_cast<SliceControlMode>(e);
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH265::CapsSliceControlSize() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(L"SliceControlSize", &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH265::SetSliceControlSize(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"SliceControlSize", (int64_t)v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetSliceControlSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"SliceControlSize", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+// Experimental
+void Plugin::AMD::EncoderH265::SetQPCBOffset(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"QPCBOFFSET", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set mode to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH265::GetQPCBOffset() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"QPCBOFFSET", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetQPCROffset(uint8_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"QPCROFFSET", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set mode to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint8_t Plugin::AMD::EncoderH265::GetQPCROffset() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"QPCROFFSET", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint8_t)e;
+}
+
+std::pair<uint32_t, uint32_t> Plugin::AMD::EncoderH265::CapsInputQueueSize() {
+ AMFTRACECALL;
+
+ const amf::AMFPropertyInfo* var;
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(L"HevcInputQueueSize", &var);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Querying capabilities failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ return std::make_pair((uint32_t)var->minValue.int64Value, (uint32_t)var->maxValue.int64Value);
+}
+
+void Plugin::AMD::EncoderH265::SetInputQueueSize(uint32_t v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"HevcInputQueueSize", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set mode to %ld, error %ls (code %d)",
+ m_UniqueId, v, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+uint32_t Plugin::AMD::EncoderH265::GetInputQueueSize() {
+ AMFTRACECALL;
+
+ int64_t e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"HevcInputQueueSize", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return (uint32_t)e;
+}
+
+void Plugin::AMD::EncoderH265::SetLowLatencyInternal(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"LowLatencyInternal", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::GetLowLatencyInternal() {
+ AMFTRACECALL;
+
+ bool e;
+
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"LowLatencyInternal", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+void Plugin::AMD::EncoderH265::SetCommonLowLatencyInternal(bool v) {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->SetProperty(L"CommonLowLatencyInternal", v);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to set mode to %s, error %ls (code %d)",
+ m_UniqueId, v ? "Enabled" : "Disabled", m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+bool Plugin::AMD::EncoderH265::GetCommonLowLatencyInternal() {
+ AMFTRACECALL;
+
+ bool e;
+ AMF_RESULT res = m_AMFEncoder->GetProperty(L"CommonLowLatencyInternal", &e);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg, "<Id: %lld> <" __FUNCTION_NAME__ "> Failed to retrieve value, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ return e;
+}
+
+// Internal
+void Plugin::AMD::EncoderH265::PacketPriorityAndKeyframe(amf::AMFDataPtr& pData, struct encoder_packet* packet) {
+ AMFTRACECALL;
+
+ uint64_t pktType;
+ pData->GetProperty(AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE, &pktType);
+ switch ((AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE_ENUM)pktType) {
+ case AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE_I:
+ packet->keyframe = true;
+ packet->priority = 3;
+ break;
+ case AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE_P:
+ packet->priority = 1;
+ break;
+ }
+}
+
+AMF_RESULT Plugin::AMD::EncoderH265::GetExtraDataInternal(amf::AMFVariant* p) {
+ AMFTRACECALL;
+
+ return m_AMFEncoder->GetProperty(AMF_VIDEO_ENCODER_HEVC_EXTRADATA, p);
+}
+
+std::string Plugin::AMD::EncoderH265::HandleTypeOverride(amf::AMFSurfacePtr & d, uint64_t index) {
+ AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_ENUM type = AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_NONE;
+
+ if ((m_PeriodPFrame > 0) && ((index % m_PeriodPFrame) == 0)) {
+ type = AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_P;
+ }
+ if ((m_PeriodIFrame > 0) && ((index % m_PeriodIFrame) == 0)) {
+ type = AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_I;
+ }
+ uint64_t realIPeriod = m_PeriodIDR * GetGOPSize();
+ if ((type != AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_NONE) && (realIPeriod > 0) && ((index % realIPeriod) == 0)) {
+ type = AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_IDR;
+ }
+ if (m_FrameSkipPeriod > 0) {
+ bool shouldSkip = m_FrameSkipKeepOnlyNth
+ ? (index % m_FrameSkipPeriod) != 0
+ : (index % m_FrameSkipPeriod) == 0;
+
+ if (shouldSkip) {
+ if ((m_FrameSkipType <= AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_SKIP) || (type < m_FrameSkipType))
+ m_FrameSkipType = type;
+ type = AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_SKIP;
+ } else if (m_FrameSkipType != AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_NONE) {
+ type = m_FrameSkipType; // Hopefully fixes the crash.
+ m_FrameSkipType = AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_NONE;
+ }
+ }
+ d->SetProperty(AMF_VIDEO_ENCODER_HEVC_FORCE_PICTURE_TYPE, type); // HEVC property; the H264 AMF_VIDEO_ENCODER_FORCE_PICTURE_TYPE constant maps to a different property name
+
+ switch (type) {
+ case AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_NONE:
+ return "Automatic";
+ case AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_SKIP:
+ return "Skip";
+ case AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_IDR:
+ return "IDR";
+ case AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_I:
+ return "I";
+ case AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_P:
+ return "P";
+ //case AMF_VIDEO_ENCODER_HEVC_PICTURE_TYPE_B:
+ //	return "B";
+ }
+ return "Unknown";
+}
+
+void Plugin::AMD::EncoderH265::LogProperties() {
+ AMFTRACECALL;
+
+ PLOG_INFO(PREFIX "Encoder Parameters:",
+ m_UniqueId);
+ #pragma region Backend
+ PLOG_INFO(PREFIX " Backend:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Video API: %s",
+ m_UniqueId,
+ m_API->GetName().c_str());
+ PLOG_INFO(PREFIX " Video Adapter: %s",
+ m_UniqueId,
+ m_APIAdapter.Name.c_str());
+ PLOG_INFO(PREFIX " OpenCL: %s",
+ m_UniqueId,
+ m_OpenCL ? "Supported" : "Not Supported");
+ PLOG_INFO(PREFIX " Transfer: %s",
+ m_UniqueId,
+ m_OpenCLSubmission ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Conversion: %s",
+ m_UniqueId,
+ m_OpenCLConversion ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Async Queue: %s",
+ m_UniqueId,
+ m_AsyncQueue ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Size: %" PRIu32,
+ m_UniqueId,
+ (uint32_t)m_AsyncQueueSize);
+ #pragma endregion Backend
+ #pragma region Frame
+ PLOG_INFO(PREFIX " Frame:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Format: %s %s %s",
+ m_UniqueId,
+ Utility::ColorFormatToString(m_ColorFormat),
+ Utility::ColorSpaceToString(m_ColorSpace),
+ m_FullColorRange ? "Full" : "Partial");
+ PLOG_INFO(PREFIX " Resolution: %" PRIu32 "x%" PRIu32,
+ m_UniqueId,
+ m_Resolution.first,
+ m_Resolution.second);
+ PLOG_INFO(PREFIX " Frame Rate: %" PRIu32 "/%" PRIu32,
+ m_UniqueId,
+ m_FrameRate.first,
+ m_FrameRate.second);
+ auto aspectRatio = GetAspectRatio();
+ PLOG_INFO(PREFIX " Aspect Ratio: %" PRIu32 ":%" PRIu32,
+ m_UniqueId,
+ aspectRatio.first,
+ aspectRatio.second);
+ #pragma endregion Frame
+ #pragma region Static
+ PLOG_INFO(PREFIX " Static:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Usage: %s",
+ m_UniqueId,
+ Utility::UsageToString(GetUsage()));
+ PLOG_INFO(PREFIX " Quality Preset: %s",
+ m_UniqueId,
+ Utility::QualityPresetToString(GetQualityPreset()));
+ auto profileLevel = static_cast<uint16_t>(GetProfileLevel());
+ PLOG_INFO(PREFIX " Profile: %s %" PRIu16 ".%" PRIu16,
+ m_UniqueId,
+ Utility::ProfileToString(GetProfile()),
+ profileLevel / 10,
+ profileLevel % 10);
+ PLOG_INFO(PREFIX " Tier: %s",
+ m_UniqueId,
+ Utility::TierToString(GetTier()));
+ PLOG_INFO(PREFIX " Coding Type: %s",
+ m_UniqueId,
+ Utility::CodingTypeToString(GetCodingType()));
+ PLOG_INFO(PREFIX " Max. Reference Frames: %" PRIu16,
+ m_UniqueId,
+ (uint16_t)GetMaximumReferenceFrames());
+ PLOG_INFO(PREFIX " Max. Long-Term Reference Frames: %" PRIu16,
+ m_UniqueId,
+ (uint16_t)GetMaximumLongTermReferenceFrames());
+ #pragma endregion Static
+ #pragma region Rate Control
+ PLOG_INFO(PREFIX " Rate Control:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Method: %s",
+ m_UniqueId,
+ Utility::RateControlMethodToString(GetRateControlMethod()));
+ PLOG_INFO(PREFIX " Pre-Pass Mode: %s",
+ m_UniqueId,
+ Utility::PrePassModeToString(GetPrePassMode()));
+ #pragma region QP
+ PLOG_INFO(PREFIX " QP:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Ranges:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " I-Frame: %" PRIu8 " - %" PRIu8,
+ m_UniqueId,
+ GetIFrameQPMinimum(),
+ GetIFrameQPMaximum());
+ PLOG_INFO(PREFIX " P-Frame: %" PRIu8 " - %" PRIu8,
+ m_UniqueId,
+ GetPFrameQPMinimum(),
+ GetPFrameQPMaximum());
+ PLOG_INFO(PREFIX " Fixed:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " I-Frame: %" PRIu8,
+ m_UniqueId,
+ GetIFrameQP());
+ PLOG_INFO(PREFIX " P-Frame: %" PRIu8,
+ m_UniqueId,
+ GetPFrameQP());
+ #pragma endregion QP
+ #pragma region Bitrate
+ PLOG_INFO(PREFIX " Bitrate:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Target: %" PRIu64 " bit/s",
+ m_UniqueId,
+ GetTargetBitrate());
+ PLOG_INFO(PREFIX " Peak: %" PRIu64 " bit/s",
+ m_UniqueId,
+ GetPeakBitrate());
+ #pragma endregion Bitrate
+ #pragma region Flags
+ PLOG_INFO(PREFIX " Flags:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Filler Data: %s",
+ m_UniqueId,
+ IsFillerDataEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Frame Skipping: %s",
+ m_UniqueId,
+ IsFrameSkippingEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Period: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetFrameSkippingPeriod());
+ PLOG_INFO(PREFIX " Behaviour: %s",
+ m_UniqueId,
+ GetFrameSkippingBehaviour() ? "Keep every Nth frame" : "Skip every Nth frame");
+ PLOG_INFO(PREFIX " Variance Based Adaptive Quantization: %s",
+ m_UniqueId,
+ IsVarianceBasedAdaptiveQuantizationEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Enforce Hypothetical Reference Decoder: %s",
+ m_UniqueId,
+ IsEnforceHRDEnabled() ? "Enabled" : "Disabled");
+ #pragma endregion Flags
+ #pragma region Video Buffering Verifier
+ PLOG_INFO(PREFIX " Video Buffering Verfier:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Buffer Size: %" PRIu64 " bits",
+ m_UniqueId,
+ GetVBVBufferSize());
+ PLOG_INFO(PREFIX " Initial Fullness: %" PRIu64 " %%",
+ m_UniqueId,
+ (uint64_t)round(GetInitialVBVBufferFullness() * 100.0));
+ #pragma endregion Video Buffering Verifier
+ PLOG_INFO(PREFIX " Max. Access Unit Size: %" PRIu32,
+ m_UniqueId,
+ GetMaximumAccessUnitSize());
+ #pragma endregion Rate Control
+
+ #pragma region Picture Control
+ PLOG_INFO(PREFIX " Picture Control:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Period:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " IDR: %" PRIu32 " GOPs",
+ m_UniqueId,
+ GetIDRPeriod());
+ PLOG_INFO(PREFIX " I: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetIFramePeriod());
+ PLOG_INFO(PREFIX " P: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetPFramePeriod());
+ PLOG_INFO(PREFIX " B: %" PRIu32 " Frames",
+ m_UniqueId,
+ GetBFramePeriod());
+ PLOG_INFO(PREFIX " GOP:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Type: %s",
+ m_UniqueId,
+ Utility::GOPTypeToString(GetGOPType()));
+ PLOG_INFO(PREFIX " Size: %" PRIu32,
+ m_UniqueId,
+ GetGOPSize());
+ PLOG_INFO(PREFIX " Size Range: %" PRIu32 " - %" PRIu32,
+ m_UniqueId,
+ GetGOPSizeMin(),
+ GetGOPSizeMax());
+ PLOG_INFO(PREFIX " Alignment: %s",
+ m_UniqueId,
+ IsGOPAlignmentEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Deblocking Filter: %s",
+ m_UniqueId,
+ IsDeblockingFilterEnabled() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Motion Estimation: %s%s",
+ m_UniqueId,
+ IsMotionEstimationQuarterPixelEnabled() ? (IsMotionEstimationHalfPixelEnabled() ? "Quarter, " : "Quarter") : "",
+ IsMotionEstimationHalfPixelEnabled() ? "Half" : "");
+ #pragma endregion Picture Control
+
+ #pragma region Intra-Refresh
+ PLOG_INFO(PREFIX " Intra-Refresh:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Mode: %" PRIu32,
+ m_UniqueId,
+ GetIntraRefreshMode());
+ PLOG_INFO(PREFIX " Frame Number: %" PRIu32,
+ m_UniqueId,
+ GetIntraRefreshFrameNum());
+ #pragma endregion Intra-Refresh
+
+ #pragma region Slicing
+ PLOG_INFO(PREFIX " Slicing:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " Slices Per Frame: %" PRIu32,
+ m_UniqueId,
+ GetSlicesPerFrame());
+ PLOG_INFO(PREFIX " Control Mode: %s",
+ m_UniqueId,
+ Utility::SliceControlModeToString(GetSliceControlMode()));
+ PLOG_INFO(PREFIX " Control Size: %" PRIu32,
+ m_UniqueId,
+ GetSliceControlSize());
+ #pragma endregion Slicing
+
+ #pragma region Experimental
+ PLOG_INFO(PREFIX " Experimental:",
+ m_UniqueId);
+ PLOG_INFO(PREFIX " QPCBOffset: %" PRIu32,
+ m_UniqueId,
+ GetQPCBOffset());
+ PLOG_INFO(PREFIX " QPCROffset: %" PRIu32,
+ m_UniqueId,
+ GetQPCROffset());
+ PLOG_INFO(PREFIX " Input Queue: %" PRIu32,
+ m_UniqueId,
+ GetInputQueueSize());
+ PLOG_INFO(PREFIX " Low Latency: %s",
+ m_UniqueId,
+ GetLowLatencyInternal() ? "Enabled" : "Disabled");
+ PLOG_INFO(PREFIX " Ultra Low Latency: %s",
+ m_UniqueId,
+ GetCommonLowLatencyInternal() ? "Enabled" : "Disabled");
+ #pragma endregion Experimental
+
+ //PLOG_INFO(PREFIX " ");
+ //PLOG_INFO(PREFIX " ");
+}
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/amf-encoder.cpp
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#include "amf-encoder.h"
+#include "utility.h"
+#include "components/VideoConverter.h"
+#ifdef WITH_AVC
+#include "components/VideoEncoderVCE.h"
+#endif
+#ifdef WITH_HEVC
+#include "components/VideoEncoderHEVC.h"
+#endif
+#include <thread>
+#include "libobs/util/threading.h"
+
+using namespace Plugin;
+using namespace Plugin::AMD;
+
+Plugin::AMD::Encoder::Encoder(Codec codec,
+ std::shared_ptr<API::IAPI> videoAPI, API::Adapter videoAdapter,
+ bool useOpenCLSubmission, bool useOpenCLConversion,
+ ColorFormat colorFormat, ColorSpace colorSpace, bool fullRangeColor,
+ bool useAsyncQueue, size_t asyncQueueSize) {
+ #pragma region Null Values
+ m_UniqueId = Utility::GetUniqueIdentifier();
+ /// AMF Internals
+ m_AMF = nullptr;
+ m_AMFFactory = nullptr;
+ m_AMFContext = nullptr;
+ m_AMFEncoder = nullptr;
+ m_AMFConverter = nullptr;
+ m_AMFMemoryType = amf::AMF_MEMORY_UNKNOWN;
+ m_AMFSurfaceFormat = Utility::ColorFormatToAMF(colorFormat);
+ /// API Related
+ m_API = nullptr;
+ m_APIDevice = nullptr;
+ m_OpenCLSubmission = false;
+ /// Properties
+ m_Codec = codec;
+ m_ColorFormat = colorFormat;
+ m_ColorSpace = colorSpace;
+ m_FullColorRange = fullRangeColor;
+ m_Resolution = std::make_pair<uint32_t, uint32_t>(0, 0);
+ m_FrameRate = std::make_pair<uint32_t, uint32_t>(0, 0);
+ m_TimestampStep = 0;
+ m_TimestampStepRounded = 0;
+ m_TimestampOffset = 0;
+ /// Flags
+ m_Initialized = true;
+ m_Started = false;
+ m_OpenCL = false;
+ m_OpenCLSubmission = useOpenCLSubmission;
+ m_OpenCLConversion = useOpenCLConversion;
+ m_HaveFirstFrame = false;
+ m_AsyncQueue = useAsyncQueue;
+ m_AsyncQueueSize = asyncQueueSize;
+ #pragma endregion Null Values
+
+ // Initialize selected API on Video Adapter
+ m_API = videoAPI;
+ m_APIAdapter = videoAdapter;
+ m_APIDevice = m_API->CreateInstance(m_APIAdapter);
+
+ // Initialize Advanced Media Framework
+ m_AMF = AMF::Instance();
+ /// Retrieve Factory
+ m_AMFFactory = m_AMF->GetFactory();
+
+ // Create Context for Conversion and Encoding
+ AMF_RESULT res = m_AMFFactory->CreateContext(&m_AMFContext);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Creating a AMF Context failed, error %ls (code %d).",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ /// Initialize Context using selected API
+ switch (m_API->GetType()) {
+ case API::Type::Direct3D11:
+ case API::Type::Direct3D9:
+ break;
+ default:
+ m_API = API::GetAPI(0);
+ switch (m_API->GetType()) {
+ case API::Type::Direct3D9:
+ m_APIAdapter = m_API->EnumerateAdapters()[0];
+ m_APIDevice = m_API->CreateInstance(m_APIAdapter);
+ break;
+ case API::Type::Direct3D11:
+ m_APIAdapter = m_API->EnumerateAdapters()[0];
+ m_APIDevice = m_API->CreateInstance(m_APIAdapter);
+ break;
+ }
+ }
+ switch (m_API->GetType()) {
+ case API::Type::Direct3D9:
+ m_AMFMemoryType = amf::AMF_MEMORY_DX9;
+ res = m_AMFContext->InitDX9(m_APIDevice->GetContext());
+ break;
+ case API::Type::Direct3D11:
+ m_AMFMemoryType = amf::AMF_MEMORY_DX11;
+ res = m_AMFContext->InitDX11(m_APIDevice->GetContext());
+ break;
+ }
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Initializing %s API with Adapter '%s' failed, error %ls (code %d).",
+ m_UniqueId,
+ m_API->GetName().c_str(), m_APIAdapter.Name.c_str(),
+ m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ // Initialize OpenCL (if possible)
+ if (m_OpenCLSubmission || m_OpenCLConversion) {
+ res = m_AMFContext->InitOpenCL();
+ if (res == AMF_OK) {
+ m_OpenCL = true;
+
+ res = m_AMFContext->GetCompute(amf::AMF_MEMORY_OPENCL, &m_AMFCompute);
+ if (res != AMF_OK) {
+ m_OpenCLSubmission = false;
+ m_OpenCLConversion = false;
+
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Retrieving Compute object failed, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ }
+ } else {
+ m_OpenCL = false;
+ m_OpenCLSubmission = false;
+ m_OpenCLConversion = false;
+
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Initialising OpenCL failed, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ }
+ }
+
+ // Create Converter
+ res = m_AMFFactory->CreateComponent(m_AMFContext, AMFVideoConverter, &m_AMFConverter);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Creating frame converter component failed, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ res = m_AMFConverter->SetProperty(AMF_VIDEO_CONVERTER_MEMORY_TYPE, amf::AMF_MEMORY_UNKNOWN);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Unable to set converter memory type, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ res = m_AMFConverter->SetProperty(AMF_VIDEO_CONVERTER_OUTPUT_FORMAT, amf::AMF_SURFACE_NV12);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Unable to set converter output format, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+ res = m_AMFConverter->SetProperty(AMF_VIDEO_CONVERTER_COLOR_PROFILE, Utility::ColorSpaceToAMFConverter(m_ColorSpace));
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Unable to set convertor color profile, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ // Create Encoder
+ res = m_AMFFactory->CreateComponent(m_AMFContext, Utility::CodecToAMF(codec), &m_AMFEncoder);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Unable to create %s encoder, error %ls (code %d)",
+ m_UniqueId,
+ Utility::CodecToString(codec),
+ m_AMF->GetTrace()->GetResultText(res),
+ res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ // Show complete initialization in log.
+ QUICK_FORMAT_MESSAGE(notice,
+ "<Id: %lld> Initialized.",
+ m_UniqueId);
+ PLOG_DEBUG("%s", notice.data());
+}
+
+Plugin::AMD::Encoder::~Encoder() {
+ // Destroy AMF Encoder
+ if (m_AMFEncoder) {
+ m_AMFEncoder->Terminate();
+ m_AMFEncoder = nullptr;
+ }
+
+ // Destroy AMF Converter
+ if (m_AMFConverter) {
+ m_AMFConverter->Terminate();
+ m_AMFConverter = nullptr;
+ }
+
+ // Destroy AMF Context
+ if (m_AMFContext) {
+ m_AMFContext->Terminate();
+ m_AMFContext = nullptr;
+ }
+
+ // Destroy API
+ if (m_API) {
+ m_APIDevice = nullptr;
+ m_API = nullptr;
+ }
+
+ m_AMF = nullptr;
+
+ // Show complete finalization in log.
+ QUICK_FORMAT_MESSAGE(notice,
+ "<Id: %lld> Finalized.",
+ m_UniqueId);
+ PLOG_DEBUG("%s", notice.c_str());
+}
+
+uint64_t Plugin::AMD::Encoder::GetUniqueId() {
+ return m_UniqueId;
+}
+
+Plugin::AMD::Codec Plugin::AMD::Encoder::GetCodec() {
+ return m_Codec;
+}
+
+std::shared_ptr<API::IAPI> Plugin::AMD::Encoder::GetVideoAPI() {
+ return m_API;
+}
+
+Plugin::API::Adapter Plugin::AMD::Encoder::GetVideoAdapter() {
+ return m_APIAdapter;
+}
+
+bool Plugin::AMD::Encoder::IsOpenCLEnabled() {
+ return m_OpenCLSubmission;
+}
+
+Plugin::AMD::ColorFormat Plugin::AMD::Encoder::GetColorFormat() {
+ return m_ColorFormat;
+}
+
+Plugin::AMD::ColorSpace Plugin::AMD::Encoder::GetColorSpace() {
+ return m_ColorSpace;
+}
+
+bool Plugin::AMD::Encoder::IsFullRangeColor() {
+ return m_FullColorRange;
+}
+
+bool Plugin::AMD::Encoder::IsAsynchronousQueueEnabled() {
+ return m_AsyncQueue;
+}
+
+size_t Plugin::AMD::Encoder::GetAsynchronousQueueSize() {
+ return m_AsyncQueueSize;
+}
+
+void Plugin::AMD::Encoder::UpdateFrameRateValues() {
+ // 1 Second
+ // 1000 Millisecond
+ // 1000000 Microsecond
+ // 10000000 amf_pts
+ // 1000000000 Nanosecond
+ m_FrameRateFraction = ((double_t)m_FrameRate.second / (double_t)m_FrameRate.first);
+ m_TimestampStep = AMF_SECOND * m_FrameRateFraction;
+ m_TimestampStepRounded = (uint64_t)round(m_TimestampStep);
+ m_SubmitQueryWaitTimer = std::chrono::nanoseconds((uint64_t)round(m_TimestampStep / m_SubmitQueryAttempts / 2));
+}
+
+void Plugin::AMD::Encoder::SetVBVBufferStrictness(double_t v) {
+ AMFTRACECALL;
+
+ auto bitrateCaps = CapsVBVBufferSize();
+ uint64_t looseBitrate = bitrateCaps.second,
+ targetBitrate = 0,
+ strictBitrate = bitrateCaps.first;
+
+ Usage usage = GetUsage();
+ if (usage == Usage::UltraLowLatency) {
+ targetBitrate = clamp(GetTargetBitrate(), bitrateCaps.first, bitrateCaps.second);
+ } else {
+ switch (this->GetRateControlMethod()) {
+ case RateControlMethod::ConstantBitrate:
+ targetBitrate = clamp(GetTargetBitrate(), bitrateCaps.first, bitrateCaps.second);
+ break;
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ targetBitrate = max(this->GetTargetBitrate(), this->GetPeakBitrate());
+ break;
+ case RateControlMethod::ConstantQP:
+ targetBitrate = bitrateCaps.second / 2;
+ break;
+ }
+ }
+ strictBitrate = clamp(static_cast<uint64_t>(
+ round(targetBitrate * ((double_t)m_FrameRate.second / (double_t)m_FrameRate.first))
+ ), bitrateCaps.first, targetBitrate);
+
+ // Three-Point Linear Lerp
+ // 0% = looseBitrate, 50% = targetBitrate, 100% = strictBitrate
+ v = clamp(v, 0.0, 1.0);
+ double_t aFadeVal = clamp(v * 2.0, 0.0, 1.0); // 0 - 0.5
+ double_t bFadeVal = clamp(v * 2.0 - 1.0, 0.0, 1.0); // 0.5 - 1.0
+
+ double_t aFade = (looseBitrate * (1.0 - aFadeVal)) + (targetBitrate * aFadeVal);
+ double_t bFade = (aFade * (1.0 - bFadeVal)) + (strictBitrate * bFadeVal);
+
+ uint64_t vbvBufferSize = static_cast<uint64_t>(round(bFade));
+ this->SetVBVBufferSize(vbvBufferSize);
+}
+
+void Plugin::AMD::Encoder::SetIFramePeriod(uint32_t v) {
+ m_PeriodIFrame = v;
+}
+
+uint32_t Plugin::AMD::Encoder::GetIFramePeriod() {
+ return m_PeriodIFrame;
+}
+
+void Plugin::AMD::Encoder::SetPFramePeriod(uint32_t v) {
+ m_PeriodPFrame = v;
+}
+
+uint32_t Plugin::AMD::Encoder::GetPFramePeriod() {
+ return m_PeriodPFrame;
+}
+
+void Plugin::AMD::Encoder::SetBFramePeriod(uint32_t v) {
+ m_PeriodBFrame = v;
+}
+
+uint32_t Plugin::AMD::Encoder::GetBFramePeriod() {
+ return m_PeriodBFrame;
+}
+
+void Plugin::AMD::Encoder::SetFrameSkippingPeriod(uint32_t v) {
+ m_FrameSkipPeriod = v;
+}
+
+uint32_t Plugin::AMD::Encoder::GetFrameSkippingPeriod() {
+ return m_FrameSkipPeriod;
+}
+
+void Plugin::AMD::Encoder::SetFrameSkippingBehaviour(bool v) {
+ m_FrameSkipKeepOnlyNth = v;
+}
+
+bool Plugin::AMD::Encoder::GetFrameSkippingBehaviour() {
+ return m_FrameSkipKeepOnlyNth;
+}
+
+void Plugin::AMD::Encoder::Start() {
+ AMFTRACECALL;
+
+ AMF_RESULT res;
+
+ res = m_AMFConverter->Init(Utility::ColorFormatToAMF(m_ColorFormat), m_Resolution.first, m_Resolution.second);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Unable to initalize converter, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res),
+ res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ res = m_AMFEncoder->Init(amf::AMF_SURFACE_NV12, m_Resolution.first, m_Resolution.second);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Failed to initialize encoder, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+
+ // Threading
+ if (m_AsyncQueue) {
+ m_AsyncSend = new EncoderThreadingData;
+ m_AsyncSend->shutdown = false;
+ m_AsyncSend->wakeupcount = 0;// 2 ^ 32;
+ m_AsyncSend->worker = std::thread(AsyncSendMain, this);
+ m_AsyncRetrieve = new EncoderThreadingData;
+ m_AsyncRetrieve->shutdown = false;
+ m_AsyncRetrieve->wakeupcount = 0;
+ m_AsyncRetrieve->worker = std::thread(AsyncRetrieveMain, this);
+ }
+
+ m_Started = true;
+}
+
+void Plugin::AMD::Encoder::Restart() {
+ AMFTRACECALL;
+
+ AMF_RESULT res = m_AMFEncoder->ReInit(m_Resolution.first, m_Resolution.second);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Could not re-initialize encoder, error %ls (code %d)",
+ m_UniqueId,
+ m_AMF->GetTrace()->GetResultText(res), res);
+ throw std::exception(errMsg.c_str());
+ }
+}
+
+void Plugin::AMD::Encoder::Stop() {
+ AMFTRACECALL;
+
+ if (!m_Started)
+ throw std::logic_error("Can't stop an encoder that isn't running!");
+
+ m_AMFConverter->Drain();
+ m_AMFConverter->Flush();
+ m_AMFEncoder->Drain();
+ m_AMFEncoder->Flush();
+
+ // Threading
+ if (m_AsyncQueue) {
+ {
+ std::unique_lock<std::mutex> lock(m_AsyncRetrieve->mutex);
+ m_AsyncRetrieve->shutdown = true;
+ m_AsyncRetrieve->wakeupcount = UINT32_MAX;
+ m_AsyncRetrieve->condvar.notify_all();
+ }
+ m_AsyncRetrieve->worker.join();
+ delete m_AsyncRetrieve;
+ {
+ std::unique_lock<std::mutex> lock(m_AsyncSend->mutex);
+ m_AsyncSend->shutdown = true;
+ m_AsyncSend->wakeupcount = UINT32_MAX;
+ m_AsyncSend->condvar.notify_all();
+ }
+ m_AsyncSend->worker.join();
+ delete m_AsyncSend;
+ }
+
+ m_Started = false;
+}
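Stop() tears the worker threads down with the standard pattern: set a shutdown flag while holding the mutex, notify the condition variable, then join. A self-contained sketch of that shutdown handshake (the struct is a simplified stand-in for EncoderThreadingData):

```cpp
#include <condition_variable>
#include <mutex>
#include <thread>

struct Worker {
    std::mutex mutex;
    std::condition_variable condvar;
    bool shutdown = false;
    std::thread thread;

    // Worker loop: sleeps on the condition variable until shutdown.
    // Spurious wakeups are harmless because the loop re-checks the flag.
    void Run() {
        std::unique_lock<std::mutex> lock(mutex);
        while (!shutdown)
            condvar.wait(lock); // releases the mutex while blocked
    }

    // Flag must be set under the mutex so the waiter cannot miss it
    // between its flag check and its call to wait().
    void Stop() {
        {
            std::unique_lock<std::mutex> lock(mutex);
            shutdown = true;
            condvar.notify_all();
        }
        thread.join();
    }
};
```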
+
+bool Plugin::AMD::Encoder::IsStarted() {
+ AMFTRACECALL;
+
+ return m_Started;
+}
+
+bool Plugin::AMD::Encoder::Encode(struct encoder_frame* frame, struct encoder_packet* packet, bool* received_packet) {
+ AMFTRACECALL;
+
+ if (!m_Started)
+ return false;
+
+ amf::AMFSurfacePtr surface = nullptr;
+ amf::AMFDataPtr surface_data = nullptr;
+ amf::AMFDataPtr packet_data = nullptr;
+
+ // Encoding Steps
+ if (!EncodeAllocate(surface))
+ return false;
+ if (!EncodeStore(surface, frame))
+ return false;
+ if (!EncodeConvert(surface, surface_data))
+ return false;
+ if (!EncodeMain(surface_data, packet_data))
+ return false;
+ if (!EncodeLoad(packet_data, packet, received_packet))
+ return false;
+
+ return true;
+}
+
+void Plugin::AMD::Encoder::GetVideoInfo(struct video_scale_info* info) {
+ AMFTRACECALL;
+
+ if (!m_AMFContext || !m_AMFEncoder)
+ throw std::exception("<" __FUNCTION_NAME__ "> Called while not initialized.");
+
+ switch (m_ColorFormat) {
+ // 4:2:0 Formats
+ case ColorFormat::NV12:
+ info->format = VIDEO_FORMAT_NV12;
+ break;
+ case ColorFormat::I420:
+ info->format = VIDEO_FORMAT_I420;
+ break;
+ // 4:2:2 Formats
+ case ColorFormat::YUY2:
+ info->format = VIDEO_FORMAT_YUY2;
+ break;
+ // Uncompressed
+ case ColorFormat::RGBA:
+ info->format = VIDEO_FORMAT_RGBA;
+ break;
+ case ColorFormat::BGRA:
+ info->format = VIDEO_FORMAT_BGRA;
+ break;
+ // Other
+ case ColorFormat::GRAY:
+ info->format = VIDEO_FORMAT_Y800;
+ break;
+ }
+
+ if (m_FullColorRange) { // Only use Full range if actually enabled.
+ info->range = VIDEO_RANGE_FULL;
+ } else {
+ info->range = VIDEO_RANGE_PARTIAL;
+ }
+}
+
+bool Plugin::AMD::Encoder::GetExtraData(uint8_t** extra_data, size_t* size) {
+ AMFTRACECALL;
+
+ if (!m_AMFContext || !m_AMFEncoder)
+ throw std::exception("<" __FUNCTION_NAME__ "> Called while not initialized.");
+
+ amf::AMFVariant var;
+ AMF_RESULT res = GetExtraDataInternal(&var);
+ if (res == AMF_OK && var.type == amf::AMF_VARIANT_INTERFACE) {
+ amf::AMFBufferPtr buf(var.pInterface);
+
+ *size = buf->GetSize();
+ m_ExtraDataBuffer.resize(*size);
+ std::memcpy(m_ExtraDataBuffer.data(), buf->GetNative(), *size);
+ *extra_data = m_ExtraDataBuffer.data();
+
+ return true;
+ }
+ return false;
+}
+
+bool Plugin::AMD::Encoder::EncodeAllocate(OUT amf::AMFSurfacePtr& surface) {
+ AMFTRACECALL;
+
+ AMF_RESULT res;
+ auto clk_start = std::chrono::high_resolution_clock::now();
+
+ // Allocate
+ if (m_OpenCLSubmission) {
+ res = m_AMFContext->AllocSurface(m_AMFMemoryType, m_AMFSurfaceFormat,
+ m_Resolution.first, m_Resolution.second, &surface);
+ } else {
+ // Required when not using OpenCL, can't directly write to GPU memory with memcpy.
+ res = m_AMFContext->AllocSurface(amf::AMF_MEMORY_HOST, m_AMFSurfaceFormat,
+ m_Resolution.first, m_Resolution.second, &surface);
+ }
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Unable to allocate Surface, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_ERROR("%s", errMsg.data());
+ return false;
+ }
+
+ // Performance Tracking
+ auto clk_end = std::chrono::high_resolution_clock::now();
+ uint64_t pf_timestamp = std::chrono::nanoseconds(clk_end.time_since_epoch()).count();
+ uint64_t pf_time = std::chrono::nanoseconds(clk_end - clk_start).count();
+
+ surface->SetProperty(AMF_TIMESTAMP_ALLOCATE, pf_timestamp);
+ surface->SetProperty(AMF_TIME_ALLOCATE, pf_time);
+
+ return true;
+}
+
+bool Plugin::AMD::Encoder::EncodeStore(OUT amf::AMFSurfacePtr& surface, IN struct encoder_frame* frame) {
+ AMFTRACECALL;
+
+ AMF_RESULT res;
+ amf::AMFComputeSyncPointPtr pSyncPoint;
+ auto clk_start = std::chrono::high_resolution_clock::now();
+
+ if (m_OpenCLSubmission) {
+ m_AMFCompute->PutSyncPoint(&pSyncPoint);
+ res = surface->Convert(amf::AMF_MEMORY_OPENCL);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Store] Conversion of Surface to OpenCL failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ return false;
+ }
+ }
+
+ size_t planeCount = surface->GetPlanesCount();
+ for (uint8_t i = 0; i < planeCount; i++) {
+ amf::AMFPlanePtr plane = surface->GetPlaneAt(i);
+ int32_t width = plane->GetWidth();
+ int32_t height = plane->GetHeight();
+ int32_t hpitch = plane->GetHPitch();
+
+ if (m_OpenCLSubmission) {
+ static const amf_size l_origin[] = { 0, 0, 0 };
+ const amf_size l_size[] = { (amf_size)width, (amf_size)height, 1 };
+ res = m_AMFCompute->CopyPlaneFromHost(frame->data[i], l_origin, l_size, frame->linesize[i], surface->GetPlaneAt(i), false);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Store] Unable to copy plane %d with OpenCL, error %ls (code %d)",
+ m_UniqueId, i, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ return false;
+ }
+ } else {
+ void* plane_nat = plane->GetNative();
+ for (int32_t py = 0; py < height; py++) {
+ int32_t plane_off = py * hpitch;
+ int32_t frame_off = py * frame->linesize[i];
+ std::memcpy(
+ static_cast<void*>(static_cast<uint8_t*>(plane_nat) + plane_off),
+ static_cast<void*>(frame->data[i] + frame_off), frame->linesize[i]);
+ }
+ }
+ }
+
+ if (m_OpenCLSubmission) {
+ res = m_AMFCompute->FinishQueue();
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Store] Failed to finish OpenCL queue, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ return false;
+ }
+ pSyncPoint->Wait();
+ }
+ res = surface->Convert(m_AMFMemoryType);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Store] Conversion of Surface failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ return false;
+ }
+
+ // Data Stuff
+ int64_t tsLast = (int64_t)round((frame->pts - 1) * m_TimestampStep);
+ int64_t tsNow = (int64_t)round(frame->pts * m_TimestampStep);
+
+ /// Decode Timestamp
+ surface->SetPts(tsNow);
+ /// Presentation Timestamp
+ surface->SetProperty(AMF_PRESENT_TIMESTAMP, frame->pts);
+ /// Duration
+ surface->SetDuration(tsNow - tsLast);
+ /// Type override
+ std::string printableType = HandleTypeOverride(surface, frame->pts);
+
+ // Performance Tracking
+ auto clk_end = std::chrono::high_resolution_clock::now();
+ uint64_t pf_timestamp = std::chrono::nanoseconds(clk_end.time_since_epoch()).count();
+ uint64_t pf_time = std::chrono::nanoseconds(clk_end - clk_start).count();
+ surface->SetProperty(AMF_TIMESTAMP_STORE, pf_timestamp);
+ surface->SetProperty(AMF_TIME_STORE, pf_time);
+
+ PLOG_DEBUG("<Id: %lld> EncodeStore: PTS(%8lld) DTS(%8lld) TS(%16lld) Duration(%16lld) Type(%s)",
+ m_UniqueId,
+ frame->pts,
+ frame->pts,
+ surface->GetPts(),
+ surface->GetDuration(),
+ printableType.c_str());
+
+ return true;
+}
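The host-memory path in EncodeStore copies each plane row by row because the surface's horizontal pitch (hpitch) generally differs from the frame's linesize; a single memcpy of the whole plane would interleave padding bytes into the image. A minimal sketch of that stride-aware copy:

```cpp
#include <cstdint>
#include <cstring>

// Copy `height` rows of `rowBytes` each from a source with stride
// `srcStride` to a destination with stride `dstStride`. One big
// memcpy is only valid when both strides equal rowBytes.
static void CopyPlane(uint8_t* dst, int32_t dstStride,
                      const uint8_t* src, int32_t srcStride,
                      int32_t rowBytes, int32_t height) {
    for (int32_t y = 0; y < height; y++)
        std::memcpy(dst + y * dstStride, src + y * srcStride, rowBytes);
}
```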
+
+bool Plugin::AMD::Encoder::EncodeConvert(IN amf::AMFSurfacePtr& surface, OUT amf::AMFDataPtr& data) {
+ AMFTRACECALL;
+
+ AMF_RESULT res;
+ auto clk_start = std::chrono::high_resolution_clock::now();
+
+ if (m_OpenCLConversion) {
+ res = surface->Convert(amf::AMF_MEMORY_OPENCL);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Convert] Conversion of Surface to OpenCL failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ return false;
+ }
+ }
+ res = m_AMFConverter->SubmitInput(surface);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Convert] Submit to converter failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ return false;
+ }
+ res = m_AMFConverter->QueryOutput(&data);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Convert] Querying output from converter failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ return false;
+ }
+ if (m_OpenCLConversion) {
+ res = surface->Convert(m_AMFMemoryType);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Convert] Conversion of Surface failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_WARNING("%s", errMsg.data());
+ return false;
+ }
+ }
+
+ // Performance Tracking
+ auto clk_end = std::chrono::high_resolution_clock::now();
+ uint64_t pf_timestamp = std::chrono::nanoseconds(clk_end.time_since_epoch()).count();
+ uint64_t pf_time = std::chrono::nanoseconds(clk_end - clk_start).count();
+ surface->SetProperty(AMF_TIMESTAMP_CONVERT, pf_timestamp);
+ surface->SetProperty(AMF_TIME_CONVERT, pf_time);
+
+ return true;
+}
+
+bool Plugin::AMD::Encoder::EncodeMain(IN amf::AMFDataPtr& data, OUT amf::AMFDataPtr& packet) {
+ AMFTRACECALL;
+
+ bool frameSubmitted = false,
+ packetRetrieved = false;
+
+ for (uint64_t attempt = 1;
+ ((attempt <= m_SubmitQueryAttempts) && (!frameSubmitted || !m_HaveFirstFrame))
+ || (m_HaveFirstFrame && !packetRetrieved);
+ attempt++) {
+ // Submit
+ if (!frameSubmitted) {
+ if (m_AsyncQueue) { // Asynchronous
+ std::unique_lock<std::mutex> slock(m_AsyncSend->mutex);
+ if (m_AsyncSend->queue.size() < m_AsyncQueueSize) {
+ m_AsyncSend->queue.push(data);
+ m_AsyncSend->wakeupcount++;
+ m_AsyncSend->condvar.notify_one();
+ frameSubmitted = true;
+ } else {
+ m_AsyncSend->wakeupcount++;
+ m_AsyncSend->condvar.notify_one();
+ }
+ } else {
+ // Performance Tracking
+ auto clk = std::chrono::high_resolution_clock::now();
+ uint64_t pf_ts = std::chrono::nanoseconds(clk.time_since_epoch()).count();
+ data->SetProperty(AMF_TIMESTAMP_SUBMIT, pf_ts);
+
+ AMF_RESULT res = m_AMFEncoder->SubmitInput(data);
+ switch (res) {
+ case AMF_INPUT_FULL: // TODO: We don't really have a way to call QueryOutput here...
+ break;
+ case AMF_OK:
+ frameSubmitted = true;
+ break;
+ default:
+ {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Main] Submitting Surface failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_ERROR("%s", errMsg.data());
+ }
+ return false;
+ }
+ }
+ }
+
+ if (!frameSubmitted)
+ std::this_thread::sleep_for(m_SubmitQueryWaitTimer);
+
+ // Retrieve
+ if (!packetRetrieved) {
+ if (m_AsyncQueue) {
+ std::unique_lock<std::mutex> rlock(m_AsyncRetrieve->mutex);
+ if (m_AsyncRetrieve->queue.size() > 0) {
+ packet = m_AsyncRetrieve->queue.front();
+ m_AsyncRetrieve->queue.pop();
+ packetRetrieved = true;
+ m_HaveFirstFrame = true;
+ } else {
+ m_AsyncRetrieve->condvar.notify_one();
+ }
+ } else {
+ AMF_RESULT res = m_AMFEncoder->QueryOutput(&packet);
+ switch (res) {
+ case AMF_REPEAT: // Returned with B-Frames, means that we need more frames.
+ case AMF_NEED_MORE_INPUT: // Same
+ if (!m_HaveFirstFrame)
+ packetRetrieved = true;
+ // TODO: Somehow call SubmitInput here.
+ break;
+ case AMF_OK:
+ m_HaveFirstFrame = true;
+ packetRetrieved = true;
+
+ // Performance Tracking
+ {
+ auto clk = std::chrono::high_resolution_clock::now();
+ uint64_t pf_query = std::chrono::nanoseconds(clk.time_since_epoch()).count(),
+ pf_submit, pf_main;
+ packet->GetProperty(AMF_TIMESTAMP_SUBMIT, &pf_submit);
+ packet->SetProperty(AMF_TIMESTAMP_QUERY, pf_query);
+ pf_main = (pf_query - pf_submit);
+ packet->SetProperty(AMF_TIME_MAIN, pf_main);
+ }
+
+ break;
+ default:
+ {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> [Main] Retrieving Packet failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_ERROR("%s", errMsg.data());
+ }
+ return false;
+ }
+ }
+ }
+
+ if (!packetRetrieved)
+ std::this_thread::sleep_for(m_SubmitQueryWaitTimer);
+ }
+ if (!frameSubmitted) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Input Queue is full, encoder is overloaded!",
+ m_UniqueId);
+ PLOG_WARNING("%s", errMsg.data());
+ }
+ if (m_HaveFirstFrame && !packetRetrieved) {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> No output Packet, encoder is overloaded!",
+ m_UniqueId);
+ PLOG_WARNING("%s", errMsg.data());
+ }
+ return true;
+}
+
+bool Plugin::AMD::Encoder::EncodeLoad(IN amf::AMFDataPtr& data, OUT struct encoder_packet* packet, OUT bool* received_packet) {
+ AMFTRACECALL;
+
+ if (data == nullptr)
+ return true;
+
+ amf::AMFBufferPtr pBuffer = amf::AMFBufferPtr(data);
+ auto clk_start = std::chrono::high_resolution_clock::now();
+
+ // Timestamps
+ packet->type = OBS_ENCODER_VIDEO;
+ /// Present Timestamp
+ data->GetProperty(AMF_PRESENT_TIMESTAMP, &packet->pts);
+ /// Decode Timestamp
+ packet->dts = (int64_t)round((double_t)data->GetPts() / m_TimestampStep) - m_TimestampOffset;
+ /// Data
+ PacketPriorityAndKeyframe(data, packet);
+ packet->size = pBuffer->GetSize();
+ if (m_PacketDataBuffer.size() < packet->size) {
+ size_t newBufferSize = (size_t)exp2(ceil(log2(packet->size)));
+ //AMF_LOG_DEBUG("Packet Buffer was resized to %d byte from %d byte.", newBufferSize, m_PacketDataBuffer.size());
+ m_PacketDataBuffer.resize(newBufferSize);
+ }
+ packet->data = m_PacketDataBuffer.data();
+ std::memcpy(packet->data, pBuffer->GetNative(), packet->size);
+
+ // Performance Tracking
+ auto clk_end = std::chrono::high_resolution_clock::now();
+ uint64_t pf_allocate_ts, pf_allocate_t,
+ pf_store_ts, pf_store_t,
+ pf_convert_ts, pf_convert_t,
+ pf_submit_ts, pf_query_ts, pf_main_t,
+ pf_load_ts, pf_load_t;
+
+ data->GetProperty(AMF_TIMESTAMP_ALLOCATE, &pf_allocate_ts);
+ data->GetProperty(AMF_TIME_ALLOCATE, &pf_allocate_t);
+ data->GetProperty(AMF_TIMESTAMP_STORE, &pf_store_ts);
+ data->GetProperty(AMF_TIME_STORE, &pf_store_t);
+ data->GetProperty(AMF_TIMESTAMP_CONVERT, &pf_convert_ts);
+ data->GetProperty(AMF_TIME_CONVERT, &pf_convert_t);
+ data->GetProperty(AMF_TIMESTAMP_SUBMIT, &pf_submit_ts);
+ data->GetProperty(AMF_TIMESTAMP_QUERY, &pf_query_ts);
+ data->GetProperty(AMF_TIME_MAIN, &pf_main_t);
+ pf_load_ts = std::chrono::nanoseconds(clk_end.time_since_epoch()).count();
+ pf_load_t = std::chrono::nanoseconds(clk_end - clk_start).count();
+
+ std::string printableType = "Unknown";
+ #ifdef WITH_AVC
+ if (m_Codec != Codec::HEVC) {
+ uint64_t type = AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_IDR;
+ data->GetProperty(AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE, &type);
+ switch ((AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_ENUM)type) {
+ case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_IDR:
+ printableType = "IDR";
+ break;
+ case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_I:
+ printableType = "I";
+ break;
+ case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_P:
+ printableType = "P";
+ break;
+ case AMF_VIDEO_ENCODER_OUTPUT_DATA_TYPE_B:
+ printableType = "B";
+ break;
+ }
+ }
+ #endif
+ #ifdef WITH_HEVC
+ if (m_Codec == Codec::HEVC) {
+ uint64_t type = AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE_I;
+ data->GetProperty(AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE, &type);
+ switch ((AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE_ENUM)type) {
+ case AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE_I:
+ printableType = "I";
+ break;
+ case AMF_VIDEO_ENCODER_HEVC_OUTPUT_DATA_TYPE_P:
+ printableType = "P";
+ break;
+ }
+ }
+ #endif
+
+ PLOG_DEBUG(
+ "<Id: %lld> EncodeLoad: PTS(%8lld) DTS(%8lld) TS(%16lld) Duration(%16lld) Size(%16lld) Type(%s)",
+ m_UniqueId,
+ packet->pts,
+ packet->dts,
+ data->GetPts(),
+ data->GetDuration(),
+ packet->size,
+ printableType.c_str());
+ PLOG_DEBUG("<Id: %lld> Timings: Allocate(%8lld ns) Store(%8lld ns) Convert(%8lld ns) Main(%8lld ns) Load(%8lld ns)",
+ m_UniqueId,
+ pf_allocate_t,
+ pf_store_t,
+ pf_convert_t,
+ pf_main_t,
+ pf_load_t);
+
+ *received_packet = true;
+
+ return true;
+}
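EncodeLoad grows the packet buffer to the next power of two via `exp2(ceil(log2(size)))`, which amortizes reallocations as packet sizes fluctuate. The same growth policy in isolation:

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

// Grow `buf` to the next power of two >= needed, mirroring the
// exp2(ceil(log2(n))) resize used for m_PacketDataBuffer. Never shrinks.
static void GrowToPow2(std::vector<uint8_t>& buf, size_t needed) {
    if (buf.size() >= needed)
        return;
    size_t newSize =
        (size_t)std::exp2(std::ceil(std::log2((double)needed)));
    buf.resize(newSize);
}
```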
+
+int32_t Plugin::AMD::Encoder::AsyncSendMain(Encoder* obj) {
+ os_set_thread_name("AMF Asynchronous Queue Sender");
+ return obj->AsyncSendLocalMain();
+}
+
+int32_t Plugin::AMD::Encoder::AsyncSendLocalMain() {
+ EncoderThreadingData* own = m_AsyncSend;
+
+ std::unique_lock<std::mutex> lock(own->mutex);
+ while (!own->shutdown) {
+ own->condvar.wait(lock, [&own] {
+ return own->shutdown || !own->queue.empty();
+ });
+
+ if (own->queue.empty())
+ continue;
+
+ AMF_RESULT res = m_AMFEncoder->SubmitInput(own->queue.front());
+ switch (res) {
+ case AMF_OK:
+ own->queue.pop();
+ own->wakeupcount--;
+ {
+ std::unique_lock<std::mutex> rlock(m_AsyncRetrieve->mutex);
+ m_AsyncRetrieve->wakeupcount++;
+ }
+ m_AsyncRetrieve->condvar.notify_one();
+ break;
+ case AMF_INPUT_FULL:
+ break;
+ default:
+ {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Submitting Surface failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_ERROR("%s", errMsg.data());
+ }
+ return -1;
+ }
+
+ std::this_thread::sleep_for(m_SubmitQueryWaitTimer);
+ }
+ return 0;
+}
+
+int32_t Plugin::AMD::Encoder::AsyncRetrieveMain(Encoder* obj) {
+ os_set_thread_name("AMF Asynchronous Queue Retriever");
+ return obj->AsyncRetrieveLocalMain();
+}
+
+int32_t Plugin::AMD::Encoder::AsyncRetrieveLocalMain() {
+ EncoderThreadingData* own = m_AsyncRetrieve;
+
+ std::unique_lock<std::mutex> lock(own->mutex);
+ while (!own->shutdown) {
+ if (own->wakeupcount == 0) {
+ own->condvar.wait(lock, [&own] {
+ return own->shutdown || (own->wakeupcount > 0);
+ });
+
+ if (own->wakeupcount == 0)
+ continue;
+ }
+
+ if (own->queue.size() < m_AsyncQueueSize) {
+ amf::AMFDataPtr packet;
+ AMF_RESULT res = m_AMFEncoder->QueryOutput(&packet);
+ switch (res) {
+ case AMF_NEED_MORE_INPUT:
+ case AMF_REPEAT:
+ {
+ std::unique_lock<std::mutex> slock(m_AsyncSend->mutex);
+ if (!m_AsyncSend->queue.empty())
+ m_AsyncSend->condvar.notify_one();
+ }
+ break;
+ case AMF_OK:
+ own->queue.push(packet);
+ own->wakeupcount--;
+
+ // Performance Tracking
+ {
+ auto clk = std::chrono::high_resolution_clock::now();
+ uint64_t pf_query = std::chrono::nanoseconds(clk.time_since_epoch()).count(),
+ pf_submit, pf_main;
+ packet->GetProperty(AMF_TIMESTAMP_SUBMIT, &pf_submit);
+ packet->SetProperty(AMF_TIMESTAMP_QUERY, pf_query);
+ pf_main = (pf_query - pf_submit);
+ packet->SetProperty(AMF_TIME_MAIN, pf_main);
+ }
+ break;
+ default:
+ {
+ QUICK_FORMAT_MESSAGE(errMsg,
+ "<Id: %lld> Retrieving Packet failed, error %ls (code %d)",
+ m_UniqueId, m_AMF->GetTrace()->GetResultText(res), res);
+ PLOG_ERROR("%s", errMsg.data());
+ }
+ return -1;
+ }
+ }
+
+ std::this_thread::sleep_for(m_SubmitQueryWaitTimer);
+ }
+ return 0;
+}
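Together, AsyncSendMain and AsyncRetrieveMain form a bounded producer/consumer pipeline around the encoder: submission backs off when the queue is full, retrieval backs off when it is empty, and both retry on the next loop iteration. A simplified, self-contained sketch of that bounded-queue handoff (class and method names are illustrative):

```cpp
#include <cstddef>
#include <mutex>
#include <queue>
#include <utility>

template <typename T>
class BoundedQueue {
    std::mutex mutex;
    std::queue<T> queue;
    size_t capacity;

public:
    explicit BoundedQueue(size_t cap) : capacity(cap) {}

    // Non-blocking push: callers retry later when full, mirroring how
    // EncodeMain re-attempts submission on its next attempt.
    bool TryPush(T item) {
        std::lock_guard<std::mutex> lock(mutex);
        if (queue.size() >= capacity)
            return false;
        queue.push(std::move(item));
        return true;
    }

    // Non-blocking pop: returns false when empty.
    bool TryPop(T& out) {
        std::lock_guard<std::mutex> lock(mutex);
        if (queue.empty())
            return false;
        out = std::move(queue.front());
        queue.pop();
        return true;
    }
};
```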
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/amf.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/amf.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
class CustomWriter : public amf::AMFTraceWriter {
public:
- virtual void Write(const wchar_t* scope, const wchar_t* message) override {
+ virtual void __cdecl Write(const wchar_t* scope, const wchar_t* message) override {
const wchar_t* realmsg = &(message[(33 + wcslen(scope) + 2)]); // Skip Time & Scope
- int msgLen = (int)wcslen(realmsg) - (sizeof(wchar_t));
+ size_t msgLen = wcslen(realmsg) - (sizeof(wchar_t));
- blog(LOG_INFO, "[AMF Encoder] [%.*ls][%ls] %.*ls",
+ blog(LOG_DEBUG, "[AMF Runtime] [%.*ls][%ls] %.*ls",
12, &(message[11]),
scope,
msgLen, realmsg);
}
- virtual void Flush() override {}
+ virtual void __cdecl Flush() override {}
};
-std::shared_ptr<Plugin::AMD::AMF> Plugin::AMD::AMF::GetInstance() {
- static std::shared_ptr<AMF> __instance = std::make_shared<AMF>();
- static std::mutex __mutex;
+#pragma region Singleton
+static AMF* __instance;
+static std::mutex __instance_mutex;
+void Plugin::AMD::AMF::Initialize() {
+ const std::lock_guard<std::mutex> lock(__instance_mutex);
+ if (!__instance)
+ __instance = new AMF();
+}
- const std::lock_guard<std::mutex> lock(__mutex);
+AMF* Plugin::AMD::AMF::Instance() {
+ const std::lock_guard<std::mutex> lock(__instance_mutex);
return __instance;
}
+void Plugin::AMD::AMF::Finalize() {
+ const std::lock_guard<std::mutex> lock(__instance_mutex);
+ if (__instance)
+ delete __instance;
+ __instance = nullptr;
+}
+#pragma endregion Singleton
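This diff replaces the Meyers-style `static shared_ptr` singleton with explicit Initialize/Instance/Finalize calls, so the plugin controls exactly when the AMF runtime is loaded and unloaded rather than leaving destruction to static teardown order. The shape of that pattern, generalized (a sketch, not the plugin's actual class):

```cpp
#include <mutex>

// Explicit-lifetime singleton: the host decides when the instance is
// created and destroyed, avoiding static-initialization-order issues.
template <typename T>
class Singleton {
    static T* instance;
    static std::mutex mutex;

public:
    static void Initialize() {
        std::lock_guard<std::mutex> lock(mutex);
        if (!instance)
            instance = new T();
    }
    static T* Instance() {
        std::lock_guard<std::mutex> lock(mutex);
        return instance;
    }
    static void Finalize() {
        std::lock_guard<std::mutex> lock(mutex);
        delete instance; // delete on nullptr is a no-op
        instance = nullptr;
    }
};

template <typename T> T* Singleton<T>::instance = nullptr;
template <typename T> std::mutex Singleton<T>::mutex;
```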
+
Plugin::AMD::AMF::AMF() {
AMF_RESULT res = AMF_OK;
- // Initialize AMF Library
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Initializing...");
-
#pragma region Null Class Members
m_TimerPeriod = 0;
- m_AMFVersion_Compiler = 0;
+ m_AMFVersion_Plugin = AMF_FULL_VERSION;
m_AMFVersion_Runtime = 0;
m_AMFModule = 0;
AMFInit = nullptr;
#pragma endregion Null Class Members
- /// Load AMF Runtime into Memory.
+ #ifdef _WIN32
+ std::vector<char> verbuf;
+ void* pProductVersion = nullptr;
+ uint32_t lProductVersionSize = 0;
+ #endif
+
+ // Initialize AMF Library
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Initializing...");
+
+ // Load AMF Runtime Library
m_AMFModule = LoadLibraryW(AMF_DLL_NAME);
if (!m_AMFModule) {
- DWORD error = GetLastError();
- std::vector<char> buf(1024);
- sprintf(buf.data(), "Unable to load '%ls', error code %ld.", AMF_DLL_NAME, error);
- throw std::exception(buf.data());
+ QUICK_FORMAT_MESSAGE(msg, "Unable to load '%ls', error code %ld.",
+ AMF_DLL_NAME,
+ GetLastError());
+ throw std::exception(msg.data());
+ } else {
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Loaded '%ls'.", AMF_DLL_NAME);
}
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Loaded '%ls'.", AMF_DLL_NAME);
- #ifdef _WIN32 // Windows: Get Product Version
- void* pProductVersion = "unknown";
- uint32_t lProductVersionSize = 7;
- std::vector<char> verbuf(GetFileVersionInfoSizeW(AMF_DLL_NAME, nullptr));
- if (GetFileVersionInfoW(AMF_DLL_NAME, 0, (DWORD)verbuf.size(), verbuf.data())) {
+
+ // Windows: Get Product Version for Driver Matching
+ #ifdef _WIN32
+ {
+ verbuf.resize(GetFileVersionInfoSizeW(AMF_DLL_NAME, nullptr) * 2);
+ GetFileVersionInfoW(AMF_DLL_NAME, 0, (DWORD)verbuf.size(), verbuf.data());
void* pBlock = verbuf.data();
} *lpTranslate;
UINT cbTranslate = sizeof(LANGANDCODEPAGE);
- if (VerQueryValueA(pBlock, "\\VarFileInfo\\Translation", (LPVOID*)&lpTranslate, &cbTranslate)) {
+ VerQueryValueA(pBlock, "\\VarFileInfo\\Translation", (LPVOID*)&lpTranslate, &cbTranslate);
- std::vector<char> buf(1024);
- sprintf(buf.data(), "%s%04x%04x%s",
- "\\StringFileInfo\\",
- lpTranslate[0].wLanguage,
- lpTranslate[0].wCodePage,
- "\\ProductVersion");
-
- // Retrieve file description for language and code page "i".
- VerQueryValueA(pBlock, buf.data(), &pProductVersion, &lProductVersionSize);
- }
+ std::vector<char> buf(1024);
+ snprintf(buf.data(), buf.size(), "%s%04x%04x%s",
+ "\\StringFileInfo\\",
+ lpTranslate[0].wLanguage,
+ lpTranslate[0].wCodePage,
+ "\\ProductVersion");
+
+ // Retrieve file description for language and code page "i".
+ VerQueryValueA(pBlock, buf.data(), &pProductVersion, &lProductVersionSize);
}
- #endif _WIN32 // Windows: Get Product Version
+ #endif // _WIN32
- /// Find Function for Querying AMF Version.
- #pragma region Query AMF Runtime Version
+ // Query Runtime Version
AMFQueryVersion = (AMFQueryVersion_Fn)GetProcAddress(m_AMFModule, AMF_QUERY_VERSION_FUNCTION_NAME);
if (!AMFQueryVersion) {
- DWORD error = GetLastError();
- std::vector<char> buf(1024);
- sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Finding Address of Function '%s' failed with error code %ld.", AMF_QUERY_VERSION_FUNCTION_NAME, error);
- throw std::exception(buf.data());
+ QUICK_FORMAT_MESSAGE(msg, "Incompatible AMF Runtime (could not find '%s'), error code %ld.",
+ AMF_QUERY_VERSION_FUNCTION_NAME,
+ GetLastError());
+ throw std::exception(msg.data());
+ } else {
+ res = AMFQueryVersion(&m_AMFVersion_Runtime);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(msg, "Querying Version failed, error code %ld.",
+ res);
+ throw std::exception(msg.data());
+ }
}
- /// Query Runtime Version
- m_AMFVersion_Compiler = AMF_FULL_VERSION;
- res = AMFQueryVersion(&m_AMFVersion_Runtime);
- if (res != AMF_OK)
- ThrowException("<" __FUNCTION_NAME__ "> Querying Version failed with error code %ld.", res);
- #pragma endregion Query AMF Runtime Version
-
- /// Find Function for Initializing AMF.
+
+ /// Initialize AMF
AMFInit = (AMFInit_Fn)GetProcAddress(m_AMFModule, AMF_INIT_FUNCTION_NAME);
if (!AMFInit) {
- DWORD error = GetLastError();
- std::vector<char> buf(1024);
- sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Finding Address of Function '%s' failed with error code %ld.", AMF_INIT_FUNCTION_NAME, error);
- throw std::exception(buf.data(), error);
+ QUICK_FORMAT_MESSAGE(msg, "Incompatible AMF Runtime (could not find '%s'), error code %ld.",
+ AMF_INIT_FUNCTION_NAME,
+ GetLastError());
+ throw std::exception(msg.data());
} else {
res = AMFInit(m_AMFVersion_Runtime, &m_AMFFactory);
- if (res != AMF_OK)
- ThrowException("<" __FUNCTION_NAME__ "> Initializing AMF Library failed with error code %ld.", res);
+ if (res != AMF_OK) {
+ QUICK_FORMAT_MESSAGE(msg, "Initializing AMF Library failed, error code %ld.",
+ res);
+ throw std::exception(msg.data());
+ }
}
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> AMF Library initialized.");
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> AMF Library initialized.");
/// Retrieve Trace Object.
res = m_AMFFactory->GetTrace(&m_AMFTrace);
if (res != AMF_OK) {
- ThrowException("<" __FUNCTION_NAME__ "> Retrieving Trace object failed with error code %ld.", res);
+ QUICK_FORMAT_MESSAGE(msg, "Retrieving AMF Trace class failed, error code %ld.",
+ res);
+ throw std::exception(msg.data());
}
/// Retrieve Debug Object.
res = m_AMFFactory->GetDebug(&m_AMFDebug);
if (res != AMF_OK) {
- ThrowExceptionWithAMFError("<" __FUNCTION_NAME__ "> Retrieving Debug object failed with error code %ls (code %ld).", res);
+ QUICK_FORMAT_MESSAGE(msg, "Retrieving AMF Debug class failed, error code %ld.",
+ res);
+ throw std::exception(msg.data());
}
/// Register Trace Writer and disable Debug Tracing.
this->EnableDebugTrace(false);
// Log success
- AMF_LOG_INFO("Version " PLUGIN_VERSION_TEXT " loaded (Compiled: %d.%d.%d.%d, Runtime: %d.%d.%d.%d, Library: %.*s).",
- (uint16_t)((m_AMFVersion_Compiler >> 48ull) & 0xFFFF),
- (uint16_t)((m_AMFVersion_Compiler >> 32ull) & 0xFFFF),
- (uint16_t)((m_AMFVersion_Compiler >> 16ull) & 0xFFFF),
- (uint16_t)((m_AMFVersion_Compiler & 0xFFFF)),
+ PLOG_INFO("Version %d.%d.%d loaded (Compiled: %d.%d.%d.%d, Runtime: %d.%d.%d.%d, Library: %.*s).",
+ PLUGIN_VERSION_MAJOR,
+ PLUGIN_VERSION_MINOR,
+ PLUGIN_VERSION_PATCH,
+ (uint16_t)((m_AMFVersion_Plugin >> 48ull) & 0xFFFF),
+ (uint16_t)((m_AMFVersion_Plugin >> 32ull) & 0xFFFF),
+ (uint16_t)((m_AMFVersion_Plugin >> 16ull) & 0xFFFF),
+ (uint16_t)((m_AMFVersion_Plugin & 0xFFFF)),
(uint16_t)((m_AMFVersion_Runtime >> 48ull) & 0xFFFF),
(uint16_t)((m_AMFVersion_Runtime >> 32ull) & 0xFFFF),
(uint16_t)((m_AMFVersion_Runtime >> 16ull) & 0xFFFF),
(uint16_t)((m_AMFVersion_Runtime & 0xFFFF)),
- lProductVersionSize, (char *)pProductVersion
+ lProductVersionSize, pProductVersion
);
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Initialized.");
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Initialized.");
}
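The version logging above unpacks a 64-bit AMF version value into four 16-bit components (most significant first) with shifts and masks. The same unpacking in isolation:

```cpp
#include <array>
#include <cstdint>

// Unpack a 64-bit packed version (16 bits per component, most
// significant first), matching the shift/mask pattern in the log call.
static std::array<uint16_t, 4> UnpackVersion(uint64_t v) {
    return { (uint16_t)((v >> 48) & 0xFFFF),
             (uint16_t)((v >> 32) & 0xFFFF),
             (uint16_t)((v >> 16) & 0xFFFF),
             (uint16_t)(v & 0xFFFF) };
}
```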
Plugin::AMD::AMF::~AMF() {
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Finalizing.");
-
- /// Unregister Trace Writer
- m_AMFTrace->UnregisterWriter(L"OBSWriter");
- delete m_TraceWriter;
- m_TraceWriter = nullptr;
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Finalizing.");
+ if (m_TraceWriter) {
+ //m_AMFTrace->UnregisterWriter(L"OBSWriter");
+ delete m_TraceWriter;
+ m_TraceWriter = nullptr;
+ }
- // Free Library again
if (m_AMFModule)
FreeLibrary(m_AMFModule);
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Finalized.");
#pragma region Null Class Members
m_TimerPeriod = 0;
- m_AMFVersion_Compiler = 0;
+ m_AMFVersion_Plugin = 0;
m_AMFVersion_Runtime = 0;
m_AMFModule = 0;
AMFQueryVersion = nullptr;
AMFInit = nullptr;
#pragma endregion Null Class Members
-
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Finalized.");
}
amf::AMFFactory* Plugin::AMD::AMF::GetFactory() {
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/api-base.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/api-base.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-base.h"
-
#include "api-d3d9.h"
#include "api-d3d11.h"
#include "api-host.h"
#include <VersionHelpers.h>
#endif
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
using namespace Plugin::API;
+
+// An Adapter on an API
bool Plugin::API::operator<(const Plugin::API::Adapter & left, const Plugin::API::Adapter & right) {
if (left == right)
return left.Name < right.Name;
return !(left == right);
}
-//////////////////////////////////////////////////////////////////////////
-// API Index
-//////////////////////////////////////////////////////////////////////////
-static std::vector<std::shared_ptr<Base>> s_APIInstances;
+// Instance of an API Adapter
+Plugin::API::Instance::Instance() {}
+
+Plugin::API::Instance::~Instance() {}
+
+// API Interface
+Plugin::API::IAPI::IAPI() {}
+
+Plugin::API::IAPI::~IAPI() {}
+
+Plugin::API::Adapter Plugin::API::IAPI::GetAdapterById(int32_t idLow, int32_t idHigh) {
+ for (auto adapter : EnumerateAdapters()) {
+ if ((adapter.idLow == idLow) && (adapter.idHigh == idHigh))
+ return adapter;
+ }
+ return *(EnumerateAdapters().begin());
+}
-void Plugin::API::Base::Initialize() {
+Plugin::API::Adapter Plugin::API::IAPI::GetAdapterByName(std::string name) {
+ for (auto adapter : EnumerateAdapters()) {
+ if (adapter.Name == name)
+ return adapter;
+ }
+ return *(EnumerateAdapters().begin());
+}
+
+// Static API Stuff
+static std::vector<std::shared_ptr<IAPI>> s_APIInstances;
+void Plugin::API::InitializeAPIs() {
// DirectX 11
#ifdef _WIN32
if (IsWindows8OrGreater()) {
}
#endif
- // OpenGL
- {
- s_APIInstances.insert(s_APIInstances.end(), std::make_shared<OpenGL>());
- }
+ // Mikhail says these are for compatibility only, not actually backends.
+ //// OpenGL
+ //{
+ // s_APIInstances.insert(s_APIInstances.end(), std::make_shared<OpenGL>());
+ //}
- // Host
- {
- s_APIInstances.insert(s_APIInstances.end(), std::make_shared<Host>());
- }
+ //// Host
+ //{
+ // s_APIInstances.insert(s_APIInstances.end(), std::make_shared<Host>());
+ //}
+}
+
+void Plugin::API::FinalizeAPIs() {
+ s_APIInstances.clear();
}
-size_t Plugin::API::Base::GetAPICount() {
+size_t Plugin::API::CountAPIs() {
return s_APIInstances.size();
}
-std::shared_ptr<Base> Plugin::API::Base::GetAPIInstance(size_t index) {
+std::string Plugin::API::GetAPIName(size_t index) {
auto indAPI = s_APIInstances.begin();
- for (size_t n = 0; n < index; n++)
- indAPI++;
+	indAPI += index; // Advance by index elements.
if (indAPI == s_APIInstances.end())
throw std::exception("Invalid API Index");
- return *indAPI;
+ return indAPI->get()->GetName();
}
-std::string Plugin::API::Base::GetAPIName(size_t index) {
+std::shared_ptr<IAPI> Plugin::API::GetAPI(size_t index) {
auto indAPI = s_APIInstances.begin();
	indAPI += index; // Advance by index elements.
if (indAPI == s_APIInstances.end())
throw std::exception("Invalid API Index");
- return indAPI->get()->GetName();
+ return *indAPI;
}
-
-std::shared_ptr<Base> Plugin::API::Base::GetAPIByName(std::string name) {
+std::shared_ptr<IAPI> Plugin::API::GetAPI(std::string name) {
for (auto api : s_APIInstances) {
if (name == api->GetName()) {
return api;
return *s_APIInstances.begin();
}
-std::vector<std::shared_ptr<Base>> Plugin::API::Base::EnumerateAPIs() {
- return std::vector<std::shared_ptr<Base>>(s_APIInstances);
+std::shared_ptr<IAPI> Plugin::API::GetAPI(Type type) {
+ for (auto api : s_APIInstances) {
+ if (type == api->GetType()) {
+ return api;
+ }
+ }
+ // If none was found, return the first one.
+ return *s_APIInstances.begin();
+}
+
+std::vector<std::shared_ptr<IAPI>> Plugin::API::EnumerateAPIs() {
+ return std::vector<std::shared_ptr<IAPI>>(s_APIInstances);
}
-std::vector<std::string> Plugin::API::Base::EnumerateAPINames() {
+std::vector<std::string> Plugin::API::EnumerateAPINames() {
std::vector<std::string> names;
for (auto api : s_APIInstances) {
names.push_back(api->GetName());
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/api-d3d11.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/api-d3d11.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
SOFTWARE.
*/
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-d3d11.h"
-
-#include <vector>
-#include <string>
#include <sstream>
#include <stdlib.h>
#include <mutex>
-#include <dxgi.h>
-#include <d3d11.h>
-#include <atlutil.h>
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
using namespace Plugin::API;
class SingletonDXGI {
HMODULE hModule;
};
-std::string Plugin::API::Direct3D11::GetName() {
- return std::string("Direct3D 11");
-}
-
-std::vector<Adapter> Plugin::API::Direct3D11::EnumerateAdapters() {
+Plugin::API::Direct3D11::Direct3D11() {
auto dxgiInst = SingletonDXGI::GetInstance();
-
- ATL::CComPtr<IDXGIFactory1> dxgiFactory;
- HRESULT hr = dxgiInst->CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&dxgiFactory);
+ HRESULT hr = dxgiInst->CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&m_DXGIFactory);
if (FAILED(hr))
- throw std::exception("<" __FUNCTION_NAME__ "> Failed to enumerate adapters, error code %X.", hr);
+ throw std::exception("<" __FUNCTION_NAME__ "> Unable to create DXGI, error code %X.", hr);
- std::vector<Adapter> adapters;
+ // Enumerate Adapters
IDXGIAdapter1* dxgiAdapter = nullptr;
for (size_t adapterIndex = 0;
- !FAILED(dxgiFactory->EnumAdapters1((UINT)adapterIndex, &dxgiAdapter));
+ !FAILED(m_DXGIFactory->EnumAdapters1((UINT)adapterIndex, &dxgiAdapter));
adapterIndex++) {
DXGI_ADAPTER_DESC1 desc = DXGI_ADAPTER_DESC1();
dxgiAdapter->GetDesc1(&desc);
if (desc.VendorId != 0x1002 /* AMD */)
continue;
- std::vector<char> descBuf(256);
- wcstombs(descBuf.data(), desc.Description, descBuf.size());
- adapters.push_back(Adapter(
+ std::vector<char> buf(1024);
+ snprintf(buf.data(), buf.size(), "%ls (VEN_%04x/DEV_%04x/SUB_%04x/REV_%04x)",
+ desc.Description,
+ desc.VendorId,
+ desc.DeviceId,
+ desc.SubSysId,
+ desc.Revision);
+
+ m_AdapterList.emplace_back(
desc.AdapterLuid.LowPart,
desc.AdapterLuid.HighPart,
- std::string(descBuf.data())
- ));
+ std::string(buf.data())
+ );
}
+}
- return adapters;
+Plugin::API::Direct3D11::~Direct3D11() {
}
-Plugin::API::Adapter Plugin::API::Direct3D11::GetAdapterById(uint32_t idLow, uint32_t idHigh) {
- for (auto adapter : EnumerateAdapters()) {
- if ((adapter.idLow == idLow) && (adapter.idHigh == idHigh))
- return adapter;
- }
- return *(EnumerateAdapters().begin());
+std::string Plugin::API::Direct3D11::GetName() {
+ return std::string("Direct3D 11");
}
-Plugin::API::Adapter Plugin::API::Direct3D11::GetAdapterByName(std::string name) {
- for (auto adapter : EnumerateAdapters()) {
- if (adapter.Name == name)
- return adapter;
- }
- return *(EnumerateAdapters().begin());
+std::vector<Adapter> Plugin::API::Direct3D11::EnumerateAdapters() {
+	// We shouldn't expect hardware to change at runtime, at least not on a normal system.
+ return m_AdapterList;
}
-struct Direct3D11Instance {
- ATL::CComPtr<IDXGIFactory1> factory;
- ATL::CComPtr<ID3D11Device> device;
- ATL::CComPtr<ID3D11DeviceContext> context;
-};
+std::shared_ptr<Instance> Plugin::API::Direct3D11::CreateInstance(Adapter adapter) {
+ //std::lock_guard<std::mutex> lock(m_InstanceMapMutex);
+ //std::pair<int32_t, int32_t> key = std::make_pair(adapter.idLow, adapter.idHigh);
+ //auto inst = m_InstanceMap.find(key);
+ //if (inst != m_InstanceMap.end())
+ // return inst->second;
-void* Plugin::API::Direct3D11::CreateInstanceOnAdapter(Adapter adapter) {
- HRESULT hr;
+ auto inst2 = std::make_shared<Direct3D11Instance>(this, adapter);
+ //m_InstanceMap.insert_or_assign(key, inst2);
+ return inst2;
+}
- auto dxgiInst = SingletonDXGI::GetInstance();
+Plugin::API::Type Plugin::API::Direct3D11::GetType() {
+ return Type::Direct3D11;
+}
- ATL::CComPtr<IDXGIFactory1> dxgiFactory;
- hr = dxgiInst->CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&dxgiFactory);
- if (FAILED(hr)) {
- std::vector<char> buf(1024);
- std::sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Failed to enumerate adapters, error code %X.", hr);
- throw std::exception(buf.data());
- }
+Plugin::API::Direct3D11Instance::Direct3D11Instance(Direct3D11* api, Adapter adapter) {
+ m_API = api;
+ m_Adapter = adapter;
+ m_Device = nullptr;
+ m_DeviceContext = nullptr;
LUID adapterLUID;
adapterLUID.LowPart = adapter.idLow;
adapterLUID.HighPart = adapter.idHigh;
+ HRESULT hr = E_FAIL;
ATL::CComPtr<IDXGIAdapter> dxgiAdapter;
for (size_t adapterIndex = 0;
- !FAILED(dxgiFactory->EnumAdapters((UINT)adapterIndex, &dxgiAdapter));
+ !FAILED(api->m_DXGIFactory->EnumAdapters((UINT)adapterIndex, &dxgiAdapter));
adapterIndex++) {
DXGI_ADAPTER_DESC desc = DXGI_ADAPTER_DESC();
dxgiAdapter->GetDesc(&desc);
D3D_FEATURE_LEVEL_11_1,
D3D_FEATURE_LEVEL_11_0
};
- ID3D11Device* d3dDevice;
- ID3D11DeviceContext* d3dContext;
for (size_t c = 0; c < 3; c++) {
uint32_t flags = 0;
flags,
featureLevels + 1, _countof(featureLevels) - 1,
D3D11_SDK_VERSION,
- &d3dDevice,
+ &m_Device,
NULL,
- &d3dContext);
+ &m_DeviceContext);
if (SUCCEEDED(hr)) {
break;
} else {
- AMF_LOG_WARNING("<" __FUNCTION_NAME__ "> Unable to create D3D11 device, error code %X (mode %Iu).", hr, c);
+			PLOG_WARNING("<" __FUNCTION_NAME__ "> Unable to create D3D11 device, error code %X (mode %zu).", hr, c);
}
}
if (FAILED(hr)) {
std::vector<char> buf(1024);
- std::sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Unable to create D3D11 device, error code %X.", hr);
- throw std::exception(buf.data());
- }
-
- Direct3D11Instance* instance = new Direct3D11Instance();
- instance->factory = dxgiFactory;
- instance->device = d3dDevice;
- instance->context = d3dContext;
- return instance;
-}
-
-Plugin::API::Adapter Plugin::API::Direct3D11::GetAdapterForInstance(void* pInstance) {
- HRESULT hr;
-
- if (pInstance == nullptr)
- throw std::invalid_argument("instance");
-
- Direct3D11Instance* instance = static_cast<Direct3D11Instance*>(pInstance);
- if (instance == nullptr)
- throw std::invalid_argument("instance");
-
- ATL::CComPtr<IDXGIAdapter> dxgiAdapter;
- hr = instance->device->QueryInterface(&dxgiAdapter);
- if (FAILED(hr)) {
- std::vector<char> buf(1024);
- std::sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Failed to query Adapter from D3D11 device, error code %X.", hr);
- throw std::exception(buf.data());
- }
-
- DXGI_ADAPTER_DESC adapterDesc;
- hr = dxgiAdapter->GetDesc(&adapterDesc);
- if (FAILED(hr)) {
- std::vector<char> buf(1024);
- std::sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Failed to get description from DXGI adapter, error code %X.", hr);
+ snprintf(buf.data(), buf.size(), "<" __FUNCTION_NAME__ "> Unable to create D3D11 device, error code %X.", hr);
throw std::exception(buf.data());
}
-
- std::vector<char> descBuf(256);
- wcstombs(descBuf.data(), adapterDesc.Description, descBuf.size());
-
- return Adapter(
- adapterDesc.AdapterLuid.LowPart,
- adapterDesc.AdapterLuid.HighPart,
- std::string(descBuf.data())
- );
}
-void* Plugin::API::Direct3D11::GetContextFromInstance(void* pInstance) {
- if (pInstance == nullptr)
- throw std::invalid_argument("instance");
-
- Direct3D11Instance* instance = static_cast<Direct3D11Instance*>(pInstance);
- if (instance == nullptr)
- throw std::invalid_argument("instance");
+Plugin::API::Direct3D11Instance::~Direct3D11Instance() {
+ if (m_Device)
+ m_Device->Release();
- return instance->device;
+ //std::lock_guard<std::mutex> lock(m_API->m_InstanceMapMutex);
+ //std::pair<int32_t, int32_t> key = std::make_pair(m_Adapter.idLow, m_Adapter.idHigh);
+ //m_API->m_InstanceMap.erase(key);
}
-void Plugin::API::Direct3D11::DestroyInstance(void* pInstance) {
- if (pInstance == nullptr)
- throw std::invalid_argument("instance");
-
- Direct3D11Instance* instance = static_cast<Direct3D11Instance*>(pInstance);
- if (instance == nullptr)
- throw std::invalid_argument("instance");
-
- delete instance;
+Plugin::API::Adapter Plugin::API::Direct3D11Instance::GetAdapter() {
+ return m_Adapter;
}
-Plugin::API::Type Plugin::API::Direct3D11::GetType() {
- return Type::Direct3D11;
+void* Plugin::API::Direct3D11Instance::GetContext() {
+ return m_Device;
}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/api-d3d9.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/api-d3d9.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
SOFTWARE.
*/
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-d3d9.h"
-
#include <mutex>
#include <list>
-#ifdef _DEBUG
-#define D3D_DEBUG_INFO
-#endif
-#pragma comment(lib, "d3d9.lib")
-#include <d3d9.h>
-#include <atlutil.h>
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
using namespace Plugin::API;
-std::string Plugin::API::Direct3D9::GetName() {
- return std::string("Direct3D 9");
-}
-
-std::vector<Adapter> Plugin::API::Direct3D9::EnumerateAdapters() {
- ATL::CComPtr<IDirect3D9Ex> pD3DEx;
- HRESULT hr = Direct3DCreate9Ex(D3D_SDK_VERSION, &pD3DEx);
+Plugin::API::Direct3D9::Direct3D9() {
+ HRESULT hr = Direct3DCreate9Ex(D3D_SDK_VERSION, &m_Direct3D9Ex);
if (FAILED(hr))
- throw std::exception("<" __FUNCTION_NAME__ "> Failed to enumerate adapters, error code %X.", hr);
-
- std::vector<Adapter> adapters;
+ throw std::exception("<" __FUNCTION_NAME__ "> Failed to create D3D9Ex, error code %X.", hr);
+
std::list<LUID> enumeratedLUIDs;
D3DADAPTER_IDENTIFIER9 adapterIdentifier;
for (size_t adapterIndex = 0;
- !FAILED(pD3DEx->GetAdapterIdentifier((UINT)adapterIndex, 0, &adapterIdentifier));
+ !FAILED(m_Direct3D9Ex->GetAdapterIdentifier((UINT)adapterIndex, 0, &adapterIdentifier));
adapterIndex++) {
if (adapterIdentifier.VendorId != 0x1002 /* AMD */)
continue;
LUID adapterLUID;
- if (FAILED(pD3DEx->GetAdapterLUID((UINT)adapterIndex, &adapterLUID)))
+ if (FAILED(m_Direct3D9Ex->GetAdapterLUID((UINT)adapterIndex, &adapterLUID)))
continue;
bool enumerated = false;
else
enumeratedLUIDs.push_back(adapterLUID);
- adapters.emplace_back(
+ std::vector<char> buf(1024);
+ snprintf(buf.data(), buf.size(), "%s [%s] (VEN_%04x/DEV_%04x/SUB_%04x/REV_%04x)",
+ adapterIdentifier.Description,
+ adapterIdentifier.DeviceName,
+
+ adapterIdentifier.VendorId,
+ adapterIdentifier.DeviceId,
+ adapterIdentifier.SubSysId,
+ adapterIdentifier.Revision);
+
+ m_Adapters.emplace_back(
Adapter(adapterLUID.LowPart, adapterLUID.HighPart,
- std::string(adapterIdentifier.Description)));
+ std::string(buf.data())));
}
+}
- return adapters;
+Plugin::API::Direct3D9::~Direct3D9() {
+ //m_InstanceMap.clear(); // Need to destroy IDirect3D9Device9Ex before IDirect3D9Ex.
+ m_Direct3D9Ex->Release();
}
-Plugin::API::Adapter Plugin::API::Direct3D9::GetAdapterById(uint32_t idLow, uint32_t idHigh) {
- for (auto adapter : EnumerateAdapters()) {
- if ((adapter.idLow == idLow) && (adapter.idHigh == idHigh))
- return adapter;
- }
- return *(EnumerateAdapters().begin());
+std::string Plugin::API::Direct3D9::GetName() {
+ return std::string("Direct3D 9");
}
-Plugin::API::Adapter Plugin::API::Direct3D9::GetAdapterByName(std::string name) {
- for (auto adapter : EnumerateAdapters()) {
- if (adapter.Name == name)
- return adapter;
- }
- return *(EnumerateAdapters().begin());
+Plugin::API::Type Plugin::API::Direct3D9::GetType() {
+ return Type::Direct3D9;
+}
+
+std::vector<Adapter> Plugin::API::Direct3D9::EnumerateAdapters() {
+ return m_Adapters;
}
-struct Direct3D9Instance {
- ATL::CComPtr<IDirect3D9Ex> d3d;
- ATL::CComPtr<IDirect3DDevice9Ex> device;
-};
+std::shared_ptr<Instance> Plugin::API::Direct3D9::CreateInstance(Adapter adapter) {
+ //std::pair<int32_t, int32_t> key = std::make_pair(adapter.idLow, adapter.idHigh);
+ //auto inst = m_InstanceMap.find(key);
+ //if (inst != m_InstanceMap.end())
+ // return inst->second;
-void* Plugin::API::Direct3D9::CreateInstanceOnAdapter(Adapter adapter) {
- ATL::CComPtr<IDirect3D9Ex> pD3DEx;
- HRESULT hr = Direct3DCreate9Ex(D3D_SDK_VERSION, &pD3DEx);
- if (FAILED(hr)) {
- std::vector<char> buf(1024);
- std::sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Unable to create Direct3D 9 instance, error code %X.", hr);
- throw std::exception(buf.data());
- }
+ auto inst2 = std::make_shared<Direct3D9Instance>(this, adapter);
+ //m_InstanceMap.insert_or_assign(key, inst2);
+ return inst2;
+}
+
+Plugin::API::Direct3D9Instance::Direct3D9Instance(Direct3D9* api, Adapter adapter) {
+ this->m_API = api;
+ this->m_Adapter = adapter;
size_t adapterNum = (size_t)-1;
D3DADAPTER_IDENTIFIER9 adapterIdentifier;
for (size_t adapterIndex = 0;
- !FAILED(pD3DEx->GetAdapterIdentifier((UINT)adapterIndex, 0, &adapterIdentifier));
+ !FAILED(api->m_Direct3D9Ex->GetAdapterIdentifier((UINT)adapterIndex, 0, &adapterIdentifier));
adapterIndex++) {
if (adapterIdentifier.VendorId != 0x1002 /* AMD */)
continue;
LUID adapterLUID;
- if (FAILED(pD3DEx->GetAdapterLUID((UINT)adapterIndex, &adapterLUID)))
+ if (FAILED(api->m_Direct3D9Ex->GetAdapterLUID((UINT)adapterIndex, &adapterLUID)))
continue;
- if ((adapterLUID.LowPart == adapter.idLow)
- && (adapterLUID.HighPart == adapter.idHigh)) {
+ if ((static_cast<int32_t>(adapterLUID.LowPart) == adapter.idLow)
+ && (static_cast<int32_t>(adapterLUID.HighPart) == adapter.idHigh)) {
adapterNum = adapterIndex;
break;
}
}
if (adapterNum == -1)
throw std::invalid_argument("adapter");
-
+
D3DPRESENT_PARAMETERS presentParameters;
std::memset(&presentParameters, 0, sizeof(D3DPRESENT_PARAMETERS));
presentParameters.BackBufferWidth = 0;
D3DCAPS9 ddCaps;
std::memset(&ddCaps, 0, sizeof(ddCaps));
- hr = pD3DEx->GetDeviceCaps((UINT)adapterNum, D3DDEVTYPE_HAL, &ddCaps);
+ HRESULT hr = api->m_Direct3D9Ex->GetDeviceCaps((UINT)adapterNum, D3DDEVTYPE_HAL, &ddCaps);
if (FAILED(hr)) {
std::vector<char> buf(1024);
- std::sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Unable to query capabilities for D3D9 adapter, error code %X.", hr);
+ snprintf(buf.data(), buf.size(), "<" __FUNCTION_NAME__ "> Unable to query capabilities for D3D9 adapter, error code %X.", hr);
throw std::exception(buf.data());
}
vp = D3DCREATE_SOFTWARE_VERTEXPROCESSING;
}
- ATL::CComPtr<IDirect3DDevice9Ex> pD3DDeviceEx;
- hr = pD3DEx->CreateDeviceEx(
+ hr = api->m_Direct3D9Ex->CreateDeviceEx(
(UINT)adapterNum,
D3DDEVTYPE_HAL,
presentParameters.hDeviceWindow,
vp | D3DCREATE_NOWINDOWCHANGES | D3DCREATE_MULTITHREADED,
&presentParameters,
NULL,
- &pD3DDeviceEx
+ &m_Device
);
if (FAILED(hr)) {
std::vector<char> buf(1024);
- std::sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Unable to create D3D9 device, error code %X.", hr);
+ snprintf(buf.data(), buf.size(), "<" __FUNCTION_NAME__ "> Unable to create D3D9 device, error code %X.", hr);
throw std::exception(buf.data());
}
+}
- Direct3D9Instance* instance = new Direct3D9Instance();
- instance->d3d = pD3DEx;
- instance->device = pD3DDeviceEx;
- return instance;
+Plugin::API::Direct3D9Instance::~Direct3D9Instance() {
+ //std::pair<int32_t, int32_t> key = std::make_pair(m_Adapter.idLow, m_Adapter.idHigh);
+ //m_API->m_InstanceMap.erase(key);
+
+ //m_Device->Release(); // Can't release/free on AMD hardware?
}
-Plugin::API::Adapter Plugin::API::Direct3D9::GetAdapterForInstance(void* pInstance) {
- if (pInstance == nullptr)
+Plugin::API::Adapter Plugin::API::Direct3D9Instance::GetAdapter() {
+ /*if (pInstance == nullptr)
throw std::invalid_argument("instance");
Direct3D9Instance* instance = static_cast<Direct3D9Instance*>(pInstance);
HRESULT hr = instance->device->GetCreationParameters(&par);
if (FAILED(hr)) {
std::vector<char> buf(1024);
- std::sprintf(buf.data(), "<" __FUNCTION_NAME__ "> Unable to get adapter from D3D9 device, error code %X.", hr);
+		snprintf(buf.data(), buf.size(), "<" __FUNCTION_NAME__ "> Unable to get adapter from D3D9 device, error code %X.", hr);
throw std::exception(buf.data());
}
auto adapters = Direct3D9::EnumerateAdapters();
if (par.AdapterOrdinal > adapters.size())
return *adapters.begin();
-
+
auto adapter = adapters.begin();
for (size_t n = 0; n < par.AdapterOrdinal; n++)
adapter++;
- return *adapter;
+ return *adapter;*/
+ return m_Adapter;
}
-void* Plugin::API::Direct3D9::GetContextFromInstance(void* pInstance) {
- if (pInstance == nullptr)
- throw std::invalid_argument("instance");
-
- Direct3D9Instance* instance = static_cast<Direct3D9Instance*>(pInstance);
- if (instance == nullptr)
- throw std::invalid_argument("instance");
-
- return instance->device;
-}
-
-void Plugin::API::Direct3D9::DestroyInstance(void* pInstance) {
- if (pInstance == nullptr)
- throw std::invalid_argument("instance");
-
- Direct3D9Instance* instance = static_cast<Direct3D9Instance*>(pInstance);
- if (instance == nullptr)
- throw std::invalid_argument("instance");
-
- delete instance;
-}
-
-Plugin::API::Type Plugin::API::Direct3D9::GetType() {
- return Type::Direct3D9;
+void* Plugin::API::Direct3D9Instance::GetContext() {
+ return m_Device;
}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/api-host.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/api-host.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
SOFTWARE.
*/
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-host.h"
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
using namespace Plugin::API;
std::string Plugin::API::Host::GetName() {
std::vector<Adapter> Plugin::API::Host::EnumerateAdapters() {
std::vector<Adapter> list;
- list.push_back(Adapter(0, 0, TEXT_T(AMF_UTIL_DEFAULT)));
+ list.push_back(Adapter(0, 0, "Default"));
return list;
}
-Plugin::API::Adapter Plugin::API::Host::GetAdapterById(uint32_t idLow, uint32_t idHigh) {
- return Adapter(0, 0, TEXT_T(AMF_UTIL_DEFAULT));
+std::shared_ptr<Instance> Plugin::API::Host::CreateInstance(Adapter adapter) {
+ return std::make_unique<HostInstance>();
}
-Plugin::API::Adapter Plugin::API::Host::GetAdapterByName(std::string name) {
- return Adapter(0, 0, TEXT_T(AMF_UTIL_DEFAULT));
+Plugin::API::Adapter Plugin::API::HostInstance::GetAdapter() {
+ return Adapter(0, 0, "Default");
}
-void* Plugin::API::Host::CreateInstanceOnAdapter(Adapter adapter) {
+void* Plugin::API::HostInstance::GetContext() {
return nullptr;
}
-
-Plugin::API::Adapter Plugin::API::Host::GetAdapterForInstance(void* pInstance) {
- return Adapter(0, 0, TEXT_T(AMF_UTIL_DEFAULT));
-}
-
-void* Plugin::API::Host::GetContextFromInstance(void* pInstance) {
- throw std::exception("Host API does not have a Context.");
-}
-
-void Plugin::API::Host::DestroyInstance(void* pInstance) {}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/api-opengl.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/api-opengl.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
SOFTWARE.
*/
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
#include "api-opengl.h"
-
#include <vector>
-#ifdef _WIN32
-#include <windows.h>
-#endif
-#include <gl/GL.h>
-
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
using namespace Plugin::API;
+Plugin::API::OpenGL::OpenGL() {
+ // ToDo: Adapter enumeration needs to go by Display/Desktop.
+	// - Nvidia is the only one that has a GPU Affinity extension.
+	// - Intel perhaps too, since they used Nvidia technology (until recently, at least).
+}
+
+Plugin::API::OpenGL::~OpenGL() {
+
+}
+
std::string Plugin::API::OpenGL::GetName() {
return std::string("OpenGL");
}
+Plugin::API::Type Plugin::API::OpenGL::GetType() {
+ return Type::OpenGL;
+}
+
std::vector<Adapter> Plugin::API::OpenGL::EnumerateAdapters() {
std::vector<Adapter> adapters;
adapters.push_back(Adapter(0, 0, "Default"));
return adapters;
}
-Plugin::API::Adapter Plugin::API::OpenGL::GetAdapterById(uint32_t idLow, uint32_t idHigh) {
- for (auto adapter : EnumerateAdapters()) {
- if ((adapter.idLow == idLow) && (adapter.idHigh == idHigh))
- return adapter;
- }
- return *(EnumerateAdapters().begin());
+std::shared_ptr<Instance> Plugin::API::OpenGL::CreateInstance(Adapter adapter) {
+ // ToDo: Actually create a hidden window and OpenGL context. Not that it is going to be useful.
+ return std::make_unique<OpenGLInstance>();
}
-Plugin::API::Adapter Plugin::API::OpenGL::GetAdapterByName(std::string name) {
- for (auto adapter : EnumerateAdapters()) {
- if (adapter.Name == name)
- return adapter;
- }
- return *(EnumerateAdapters().begin());
-}
+Plugin::API::OpenGLInstance::OpenGLInstance() {
-void* Plugin::API::OpenGL::CreateInstanceOnAdapter(Adapter adapter) {
- return nullptr;
}
-void Plugin::API::OpenGL::DestroyInstance(void* instance) {
- return;
-}
+Plugin::API::OpenGLInstance::~OpenGLInstance() {
-Plugin::API::Adapter Plugin::API::OpenGL::GetAdapterForInstance(void* instance) {
- return *(EnumerateAdapters().begin());
}
-void* Plugin::API::OpenGL::GetContextFromInstance(void* instance) {
- return nullptr;
+Plugin::API::Adapter Plugin::API::OpenGLInstance::GetAdapter() {
+ return Adapter(0, 0, "Default");
}
-Plugin::API::Type Plugin::API::OpenGL::GetType() {
- return Type::OpenGL;
+void* Plugin::API::OpenGLInstance::GetContext() {
+ return nullptr;
}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/enc-h264.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/enc-h264.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
SOFTWARE.
*/
-#pragma once
-
//////////////////////////////////////////////////////////////////////////
-// Includes
+// New UI Design
//////////////////////////////////////////////////////////////////////////
-#include "enc-h264.h"
-#include "misc-util.cpp"
+// All: Preset
+// ----------- Static Section
+// Mas: Usage
+// All: Quality Preset
+// Adv: Profile
+// Adv: Profile Level
+// Mas: Aspect Ratio
+// Exp: Coding Type
+// Exp: Maximum Reference Frames
+// ----------- Rate Control Section
+// All: Rate Control Method
+// Adv: Pre-Pass Encoding (if supported)
+// All, CBR&VBR: Target Bitrate
+// All, VBR: Peak Bitrate
+// All, CQP: QP I/P/B
+// Adv, CBR&VBR: Min/Max QP
+// CBR: Filler Data
+// Adv: Frame Skipping
+// Exp: VBAQ
+// Exp: Enforce HRD
+// ----------- VBV Buffer
+// Adv: VBV Buffer Size
+// Exp: VBV Buffer Initial Fullness
+// ----------- Picture Control
+// All: Keyframe Interval (Float)
+// Mas: IDR Period (Overrides Keyframe Interval if non-zero)
+// Adv: B-Frames (if supported)
+// Adv: B-Frame Delta QP (if supported, not CQP)
+// Adv: B-Frame Reference (if supported and B-Frames enabled)
+// Adv: B-Frame Reference Delta QP (if supported, not CQP)
+// Exp: Deblocking Filter
+// Exp: Motion Estimation (Dropdown)
+// ----------- Intra-Refresh
+// ToDo: Master Mode only?
+// ----------- System
+// Adv: API
+// Adv: Adapter
+// Exp: OpenCL
+// All: View
-#ifdef _WIN32
-#include <VersionHelpers.h>
+#include "enc-h264.h"
+#include "amf-encoder-h264.h"
+#include "amf-capabilities.h"
+#include "strings.h"
+#include "utility.h"
-#include "api-d3d9.h"
-#include "api-d3d11.h"
-#endif
+#define PREFIX "[H264/AVC]"
-//////////////////////////////////////////////////////////////////////////
-// Code
-//////////////////////////////////////////////////////////////////////////
using namespace Plugin;
using namespace Plugin::AMD;
using namespace Plugin::Interface;
-
-enum class Presets : int8_t {
- None = -1,
- ResetToDefaults = 0,
- Recording,
- HighQuality,
- Indistinguishable,
- Lossless,
- Twitch,
- YouTube,
-};
-enum class ViewMode :uint8_t {
- Basic,
- Advanced,
- Expert,
- Master
-};
+using namespace Utility;
void Plugin::Interface::H264Interface::encoder_register() {
- // Ensure that there is a supported AMD GPU.
- bool haveAVCsupport = false;
- for (auto api : Plugin::API::Base::EnumerateAPIs()) {
- for (auto adapter : api->EnumerateAdapters()) {
- auto caps = VCECapabilities::GetInstance()->GetAdapterCapabilities(api, adapter, H264EncoderType::AVC);
- if (caps.acceleration_type != amf::AMF_ACCEL_NOT_SUPPORTED)
- haveAVCsupport = true;
- }
- }
- if (!haveAVCsupport) {
- AMF_LOG_WARNING("No detected GPU supports H264 encoding.");
+ // Test if we actually have AVC support.
+ if (!AMD::CapabilityManager::Instance()->IsCodecSupported(Codec::AVC)) {
+ PLOG_WARNING(PREFIX " Not supported by any GPU, disabling...");
return;
}
encoder_info->get_extra_data = &get_extra_data;
obs_register_encoder(encoder_info.get());
+ PLOG_DEBUG(PREFIX " Registered.");
}
const char* Plugin::Interface::H264Interface::get_name(void*) {
- static const char* name = "H264 Encoder (AMD Advanced Media Framework)";
+ static const char* name = "H264/AVC Encoder (" PLUGIN_NAME ")";
return name;
}
void* Plugin::Interface::H264Interface::create(obs_data_t* settings, obs_encoder_t* encoder) {
- Plugin::Interface::H264Interface* enc = nullptr;
try {
- AMF_LOG_INFO("Starting up...");
- enc = new Plugin::Interface::H264Interface(settings, encoder);
- return enc;
+ return new Plugin::Interface::H264Interface(settings, encoder);
	} catch (const std::exception& e) {
- AMF_LOG_ERROR("%s", e.what());
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- } catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
+ PLOG_ERROR("%s", e.what());
}
- if (enc)
- delete enc;
- return NULL;
+ return nullptr;
}
void Plugin::Interface::H264Interface::destroy(void* data) {
- try {
- AMF_LOG_INFO("Shutting down...");
- Plugin::Interface::H264Interface* enc = static_cast<Plugin::Interface::H264Interface*>(data);
- delete enc;
- } catch (std::exception e) {
- AMF_LOG_ERROR("%s", e.what());
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- } catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
- }
- data = nullptr;
+ if (data)
+ delete static_cast<Plugin::Interface::H264Interface*>(data);
}
bool Plugin::Interface::H264Interface::encode(void *data, struct encoder_frame *frame, struct encoder_packet *packet, bool *received_packet) {
try {
- return static_cast<Plugin::Interface::H264Interface*>(data)->encode(frame, packet, received_packet);
+ if (data)
+ return static_cast<Plugin::Interface::H264Interface*>(data)->encode(frame, packet, received_packet);
	} catch (const std::exception& e) {
- AMF_LOG_ERROR("%s", e.what());
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- } catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
- throw;
+ PLOG_ERROR("%s", e.what());
}
return false;
}
obs_data_set_default_string(data, "rate_control", "");
obs_data_set_default_string(data, "profile", "");
obs_data_set_default_string(data, "preset", "");
- obs_data_set_int(data, "bitrate", -1);
- obs_data_set_int(data, "keyint_sec", -1);
- obs_data_set_string(data, "rate_control", "");
- obs_data_set_string(data, "profile", "");
- obs_data_set_string(data, "preset", "");
#pragma endregion OBS - Enforce Streaming Service Restrictions
// Preset
- obs_data_set_default_int(data, AMF_H264_PRESET, static_cast<int32_t>(Presets::None));
+ obs_data_set_default_int(data, P_PRESET, static_cast<int64_t>(Presets::None));
// Static Properties
- obs_data_set_default_int(data, AMF_H264_USAGE, static_cast<int32_t>(H264Usage::Transcoding));
- obs_data_set_default_int(data, AMF_H264_QUALITY_PRESET, static_cast<int32_t>(H264QualityPreset::Balanced));
- obs_data_set_default_int(data, AMF_H264_PROFILE, static_cast<int32_t>(H264Profile::Main));
- obs_data_set_default_int(data, AMF_H264_PROFILELEVEL, static_cast<int32_t>(H264ProfileLevel::Automatic));
+ //obs_data_set_default_int(data, P_USAGE, static_cast<int64_t>(Usage::Transcoding));
+ obs_data_set_default_int(data, P_QUALITYPRESET, static_cast<int64_t>(QualityPreset::Balanced));
+ obs_data_set_default_int(data, P_PROFILE, static_cast<int64_t>(Profile::Main));
+ obs_data_set_default_int(data, P_PROFILELEVEL, static_cast<int64_t>(ProfileLevel::Automatic));
+ //obs_data_set_default_frames_per_second(data, P_ASPECTRATIO, media_frames_per_second{ 1, 1 }, "");
+ obs_data_set_default_int(data, P_CODINGTYPE, static_cast<int64_t>(CodingType::Automatic));
+ obs_data_set_default_int(data, P_MAXIMUMREFERENCEFRAMES, 4);
// Rate Control Properties
- obs_data_set_int(data, "last" vstr(AMF_H264_RATECONTROLMETHOD), -1);
- obs_data_set_default_int(data, AMF_H264_RATECONTROLMETHOD, static_cast<int32_t>(H264RateControlMethod::ConstantBitrate));
- obs_data_set_default_int(data, AMF_H264_BITRATE_TARGET, 3500);
- obs_data_set_default_int(data, AMF_H264_BITRATE_PEAK, 9000);
- obs_data_set_default_int(data, AMF_H264_QP_MINIMUM, 11);
- obs_data_set_default_int(data, AMF_H264_QP_MAXIMUM, 51);
- obs_data_set_default_int(data, AMF_H264_QP_IFRAME, 22);
- obs_data_set_default_int(data, AMF_H264_QP_PFRAME, 22);
- obs_data_set_default_int(data, AMF_H264_QP_BFRAME, 22);
- obs_data_set_int(data, "last" vstr(AMF_H264_VBVBUFFER), -1);
- obs_data_set_default_int(data, AMF_H264_VBVBUFFER, 0);
- obs_data_set_default_int(data, AMF_H264_VBVBUFFER_SIZE, 3500);
- obs_data_set_default_double(data, AMF_H264_VBVBUFFER_STRICTNESS, 50);
- obs_data_set_default_double(data, AMF_H264_VBVBUFFER_FULLNESS, 100);
- obs_data_set_default_int(data, AMF_H264_MAXIMUMACCESSUNITSIZE, 0);
- obs_data_set_default_int(data, AMF_H264_FILLERDATA, 1);
- obs_data_set_default_int(data, AMF_H264_FRAMESKIPPING, 0);
- obs_data_set_default_int(data, AMF_H264_ENFORCEHRDCOMPATIBILITY, 1);
-
- // Frame Control Properties
- obs_data_set_default_double(data, AMF_H264_KEYFRAME_INTERVAL, 2);
- obs_data_set_default_int(data, AMF_H264_IDR_PERIOD, 60);
- obs_data_set_int(data, "last" vstr(AMF_H264_BFRAME_PATTERN), -1);
- obs_data_set_default_int(data, AMF_H264_BFRAME_PATTERN, static_cast<int32_t>(H264BFramePattern::None));
- obs_data_set_int(data, "last" vstr(AMF_H264_BFRAME_REFERENCE), -1);
- obs_data_set_default_int(data, AMF_H264_BFRAME_REFERENCE, 0);
- obs_data_set_default_int(data, AMF_H264_BFRAME_REFERENCEDELTAQP, 2);
- obs_data_set_default_int(data, AMF_H264_BFRAME_DELTAQP, 4);
- obs_data_set_default_int(data, AMF_H264_DEBLOCKINGFILTER, 1);
-
- // Miscellaneous Control Properties
- obs_data_set_default_int(data, AMF_H264_SCANTYPE, static_cast<int32_t>(H264ScanType::Progressive));
- obs_data_set_default_int(data, AMF_H264_MOTIONESTIMATION, 3);
-
- // Experimental Properties
- obs_data_set_default_int(data, AMF_H264_MAXIMUMLTRFRAMES, 0);
- obs_data_set_default_int(data, AMF_H264_CODINGTYPE, static_cast<int32_t>(H264CodingType::Default));
- obs_data_set_default_int(data, AMF_H264_HEADER_INSERTION_SPACING, 0);
- obs_data_set_default_int(data, AMF_H264_SLICESPERFRAME, 1);
- obs_data_set_default_int(data, AMF_H264_SLICEMODE, static_cast<int32_t>(H264SliceMode::Horizontal));
- obs_data_set_default_int(data, AMF_H264_MAXIMUMSLICESIZE, INT_MAX);
- obs_data_set_default_int(data, AMF_H264_SLICECONTROLMODE, static_cast<int32_t>(H264SliceControlMode::Off));
- obs_data_set_default_int(data, AMF_H264_SLICECONTROLSIZE, 0);
- obs_data_set_default_int(data, AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES, 0);
- obs_data_set_default_int(data, AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT, 0);
- obs_data_set_default_int(data, AMF_H264_WAITFORTASK, 0);
- obs_data_set_default_int(data, AMF_H264_PREANALYSISPASS, 0);
- obs_data_set_default_int(data, AMF_H264_VBAQ, 0);
- obs_data_set_default_int(data, AMF_H264_GOPSIZE, 0);
- obs_data_set_default_int(data, AMF_H264_GOPALIGNMENT, 1);
- obs_data_set_default_int(data, AMF_H264_MAXIMUMREFERENCEFRAMES, 4);
+ obs_data_set_default_int(data, ("last" P_RATECONTROLMETHOD), -1);
+ obs_data_set_default_int(data, P_RATECONTROLMETHOD, static_cast<int64_t>(RateControlMethod::ConstantBitrate));
+ obs_data_set_default_int(data, P_PREPASSMODE, static_cast<int64_t>(PrePassMode::Disabled));
+ obs_data_set_default_int(data, P_BITRATE_TARGET, 3500);
+ obs_data_set_default_int(data, P_BITRATE_PEAK, 9000);
+ obs_data_set_default_int(data, P_QP_IFRAME, 22);
+ obs_data_set_default_int(data, P_QP_PFRAME, 22);
+ obs_data_set_default_int(data, P_QP_BFRAME, 22);
+ obs_data_set_default_int(data, P_QP_MINIMUM, 18);
+ obs_data_set_default_int(data, P_QP_MAXIMUM, 51);
+ obs_data_set_default_int(data, P_FILLERDATA, 1);
+ obs_data_set_default_int(data, P_FRAMESKIPPING, 0);
+ obs_data_set_default_int(data, P_FRAMESKIPPING_PERIOD, 0);
+ obs_data_set_default_int(data, P_FRAMESKIPPING_BEHAVIOUR, 0);
+ obs_data_set_default_int(data, P_VBAQ, 1);
+ obs_data_set_default_int(data, P_ENFORCEHRD, 1);
+
+ // VBV Buffer
+ obs_data_set_default_int(data, ("last" P_VBVBUFFER), -1);
+ obs_data_set_default_int(data, P_VBVBUFFER, 0);
+ obs_data_set_default_int(data, P_VBVBUFFER_SIZE, 3500);
+ obs_data_set_default_double(data, P_VBVBUFFER_STRICTNESS, 50);
+ obs_data_set_default_double(data, P_VBVBUFFER_INITIALFULLNESS, 100);
+
+ // Picture Control
+ obs_data_set_default_double(data, P_INTERVAL_KEYFRAME, 2.0);
+ obs_data_set_default_int(data, P_PERIOD_IDR_H264, 0);
+ obs_data_set_default_double(data, P_INTERVAL_IFRAME, 0.0);
+ obs_data_set_default_int(data, P_PERIOD_IFRAME, 0);
+ obs_data_set_default_double(data, P_INTERVAL_PFRAME, 0.0);
+ obs_data_set_default_int(data, P_PERIOD_PFRAME, 0);
+ obs_data_set_default_double(data, P_INTERVAL_BFRAME, 0.0);
+ obs_data_set_default_int(data, P_PERIOD_BFRAME, 0);
+ obs_data_set_default_int(data, ("last" P_BFRAME_PATTERN), -1);
+ obs_data_set_default_int(data, P_BFRAME_PATTERN, 0);
+ obs_data_set_default_int(data, ("last" P_BFRAME_REFERENCE), -1);
+ obs_data_set_default_int(data, P_BFRAME_REFERENCE, 0);
+ obs_data_set_default_int(data, P_BFRAME_REFERENCEDELTAQP, 2);
+ obs_data_set_default_int(data, P_BFRAME_DELTAQP, 4);
+ obs_data_set_default_int(data, P_DEBLOCKINGFILTER, 1);
+ obs_data_set_default_int(data, P_MOTIONESTIMATION, 3);
// System Properties
- obs_data_set_string(data, "last" vstr(AMF_H264_VIDEOAPI), "");
- obs_data_set_default_string(data, AMF_H264_VIDEOAPI, "");
- obs_data_set_int(data, "last" vstr(AMF_H264_VIDEOADAPTER), 0);
- obs_data_set_default_int(data, AMF_H264_VIDEOADAPTER, 0);
- obs_data_set_default_int(data, AMF_H264_OPENCL, 0);
- obs_data_set_int(data, "last" vstr(AMF_H264_VIEW), -1);
- obs_data_set_default_int(data, AMF_H264_VIEW, static_cast<int32_t>(ViewMode::Basic));
- obs_data_set_default_bool(data, AMF_H264_DEBUG, false);
- obs_data_set_default_int(data, AMF_H264_VERSION, PLUGIN_VERSION_FULL);
+ obs_data_set_default_string(data, ("last" P_VIDEO_API), "");
+ obs_data_set_default_string(data, P_VIDEO_API, "");
+ obs_data_set_default_int(data, ("last" P_VIDEO_ADAPTER), 0);
+ obs_data_set_default_int(data, P_VIDEO_ADAPTER, 0);
+ obs_data_set_default_int(data, P_OPENCL_TRANSFER, 0);
+ obs_data_set_default_int(data, P_OPENCL_CONVERSION, 0);
+ obs_data_set_default_int(data, P_ASYNCHRONOUSQUEUE, 0);
+ obs_data_set_default_int(data, P_ASYNCHRONOUSQUEUE_SIZE, 4);
+ obs_data_set_default_int(data, ("last" P_VIEW), -1);
+ obs_data_set_default_int(data, P_VIEW, static_cast<int64_t>(ViewMode::Basic));
+ obs_data_set_default_bool(data, P_DEBUG, false);
+ obs_data_set_default_int(data, P_VERSION, PLUGIN_VERSION_FULL);
}
static void fill_api_list(obs_property_t* p) {
obs_property_list_clear(p);
- for (auto api : Plugin::API::Base::EnumerateAPIs()) {
- obs_property_list_add_string(p, api->GetName().c_str(), api->GetName().c_str());
+ auto cm = CapabilityManager::Instance();
+
+ for (auto api : Plugin::API::EnumerateAPIs()) {
+ if (cm->IsCodecSupportedByAPI(Codec::AVC, api->GetType()))
+ obs_property_list_add_string(p, api->GetName().c_str(), api->GetName().c_str());
}
}
static void fill_device_list(obs_property_t* p, const char* apiname) {
obs_property_list_clear(p);
- auto api = Plugin::API::Base::GetAPIByName(std::string(apiname));
+
+ auto cm = CapabilityManager::Instance();
+ auto api = Plugin::API::GetAPI(std::string(apiname));
for (auto adapter : api->EnumerateAdapters()) {
- obs_property_list_add_int(p, adapter.Name.c_str(), ((int64_t)adapter.idHigh << 32) + (int64_t)adapter.idLow);
+ union {
+ int32_t id[2];
+ int64_t v;
+ } adapterid = { adapter.idLow, adapter.idHigh };
+ if (cm->IsCodecSupportedByAPIAdapter(Codec::AVC, api->GetType(), adapter))
+ obs_property_list_add_int(p, adapter.Name.c_str(), adapterid.v);
}
}
-obs_properties_t* Plugin::Interface::H264Interface::get_properties(void*) {
+obs_properties_t* Plugin::Interface::H264Interface::get_properties(void* data) {
obs_properties* props = obs_properties_create();
obs_property_t* p;
#pragma region Preset
- p = obs_properties_add_list(props, AMF_H264_PRESET, TEXT_T(AMF_H264_PRESET), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ p = obs_properties_add_list(props, P_PRESET, P_TRANSLATE(P_PRESET), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
obs_property_set_modified_callback(p, properties_modified);
obs_property_list_add_int(p, "", -1);
- obs_property_list_add_int(p, TEXT_T(AMF_H264_PRESET_RESETTODEFAULTS), static_cast<int32_t>(Presets::ResetToDefaults));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_PRESET_RECORDING), static_cast<int32_t>(Presets::Recording));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_PRESET_HIGHQUALITY), static_cast<int32_t>(Presets::HighQuality));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_PRESET_INDISTINGUISHABLE), static_cast<int32_t>(Presets::Indistinguishable));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_PRESET_LOSSLESS), static_cast<int32_t>(Presets::Lossless));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_PRESET_TWITCH), static_cast<int32_t>(Presets::Twitch));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_PRESET_YOUTUBE), static_cast<int32_t>(Presets::YouTube));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PRESET_RESETTODEFAULTS), static_cast<int32_t>(Presets::ResetToDefaults));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PRESET_RECORDING), static_cast<int32_t>(Presets::Recording));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PRESET_HIGHQUALITY), static_cast<int32_t>(Presets::HighQuality));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PRESET_INDISTINGUISHABLE), static_cast<int32_t>(Presets::Indistinguishable));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PRESET_LOSSLESS), static_cast<int32_t>(Presets::Lossless));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PRESET_TWITCH), static_cast<int32_t>(Presets::Twitch));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PRESET_YOUTUBE), static_cast<int32_t>(Presets::YouTube));
#pragma endregion Preset
- #pragma region Static Properties
- #pragma region Usage
- p = obs_properties_add_list(props, AMF_H264_USAGE, TEXT_T(AMF_H264_USAGE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_USAGE_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_USAGE_TRANSCODING), static_cast<int32_t>(H264Usage::Transcoding));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_USAGE_ULTRALOWLATENCY), static_cast<int32_t>(H264Usage::UltraLowLatency));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_USAGE_LOWLATENCY), static_cast<int32_t>(H264Usage::LowLatency));
- // Webcam requires SVC, which is not something OBSs properties API makes easy to support. Nor would it look like anything usable.
- //obs_property_list_add_int(list, TEXT_T(AMF_H264_USAGE_WEBCAM), VCEUsage_Webcam);
- #pragma endregion Usage
-
+ // Static Properties
#pragma region Quality Preset
- p = obs_properties_add_list(props, AMF_H264_QUALITY_PRESET, TEXT_T(AMF_H264_QUALITY_PRESET), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_QUALITY_PRESET_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_QUALITY_PRESET_SPEED), static_cast<int32_t>(H264QualityPreset::Speed));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_QUALITY_PRESET_BALANCED), static_cast<int32_t>(H264QualityPreset::Balanced));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_QUALITY_PRESET_QUALITY), static_cast<int32_t>(H264QualityPreset::Quality));
+ p = obs_properties_add_list(props, P_QUALITYPRESET, P_TRANSLATE(P_QUALITYPRESET), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QUALITYPRESET)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_QUALITYPRESET_SPEED), static_cast<int32_t>(QualityPreset::Speed));
+ obs_property_list_add_int(p, P_TRANSLATE(P_QUALITYPRESET_BALANCED), static_cast<int32_t>(QualityPreset::Balanced));
+ obs_property_list_add_int(p, P_TRANSLATE(P_QUALITYPRESET_QUALITY), static_cast<int32_t>(QualityPreset::Quality));
#pragma endregion Quality Preset
- #pragma region Profile
- p = obs_properties_add_list(props, AMF_H264_PROFILE, TEXT_T(AMF_H264_PROFILE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_PROFILE_DESCRIPTION));
- #pragma endregion Profile
-
- #pragma region Profile Level
- p = obs_properties_add_list(props, AMF_H264_PROFILELEVEL, TEXT_T(AMF_H264_PROFILELEVEL), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_PROFILELEVEL_DESCRIPTION));
- #pragma endregion Profile Levels
- #pragma endregion Static Properties
-
- #pragma region Rate Control Properties
- #pragma region Method
- p = obs_properties_add_list(props, AMF_H264_RATECONTROLMETHOD, TEXT_T(AMF_H264_RATECONTROLMETHOD), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_RATECONTROLMETHOD_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_RATECONTROLMETHOD_CQP), static_cast<int32_t>(H264RateControlMethod::ConstantQP));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_RATECONTROLMETHOD_CBR), static_cast<int32_t>(H264RateControlMethod::ConstantBitrate));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_RATECONTROLMETHOD_VBR), static_cast<int32_t>(H264RateControlMethod::VariableBitrate_PeakConstrained));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_RATECONTROLMETHOD_VBR_LAT), static_cast<int32_t>(H264RateControlMethod::VariableBitrate_LatencyConstrained));
- obs_property_set_modified_callback(p, properties_modified);
- #pragma endregion Method
+ #pragma region Profile, Levels
+ p = obs_properties_add_list(props, P_PROFILE, P_TRANSLATE(P_PROFILE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PROFILE)));
+ obs_property_list_add_int(p, "Constrained Baseline", static_cast<int32_t>(Profile::ConstrainedBaseline));
+ obs_property_list_add_int(p, "Baseline", static_cast<int32_t>(Profile::Baseline));
+ obs_property_list_add_int(p, "Main", static_cast<int32_t>(Profile::Main));
+ obs_property_list_add_int(p, "Constrained High", static_cast<int32_t>(Profile::ConstrainedHigh));
+ obs_property_list_add_int(p, "High", static_cast<int32_t>(Profile::High));
+
+ p = obs_properties_add_list(props, P_PROFILELEVEL, P_TRANSLATE(P_PROFILELEVEL), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PROFILELEVEL)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_AUTOMATIC), static_cast<int32_t>(ProfileLevel::Automatic));
+ obs_property_list_add_int(p, "1.0", static_cast<int32_t>(ProfileLevel::L10));
+ obs_property_list_add_int(p, "1.1", static_cast<int32_t>(ProfileLevel::L11));
+ obs_property_list_add_int(p, "1.2", static_cast<int32_t>(ProfileLevel::L12));
+ obs_property_list_add_int(p, "1.3", static_cast<int32_t>(ProfileLevel::L13));
+ obs_property_list_add_int(p, "2.0", static_cast<int32_t>(ProfileLevel::L20));
+ obs_property_list_add_int(p, "2.1", static_cast<int32_t>(ProfileLevel::L21));
+ obs_property_list_add_int(p, "2.2", static_cast<int32_t>(ProfileLevel::L22));
+ obs_property_list_add_int(p, "3.0", static_cast<int32_t>(ProfileLevel::L30));
+ obs_property_list_add_int(p, "3.1", static_cast<int32_t>(ProfileLevel::L31));
+ obs_property_list_add_int(p, "3.2", static_cast<int32_t>(ProfileLevel::L32));
+ obs_property_list_add_int(p, "4.0", static_cast<int32_t>(ProfileLevel::L40));
+ obs_property_list_add_int(p, "4.1", static_cast<int32_t>(ProfileLevel::L41));
+ obs_property_list_add_int(p, "4.2", static_cast<int32_t>(ProfileLevel::L42));
+ obs_property_list_add_int(p, "5.0", static_cast<int32_t>(ProfileLevel::L50));
+ obs_property_list_add_int(p, "5.1", static_cast<int32_t>(ProfileLevel::L51));
+ obs_property_list_add_int(p, "5.2", static_cast<int32_t>(ProfileLevel::L52));
+ #pragma endregion Profile, Levels
+
+ #pragma region Coding Type
+ p = obs_properties_add_list(props, P_CODINGTYPE, P_TRANSLATE(P_CODINGTYPE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_CODINGTYPE)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_AUTOMATIC), static_cast<int32_t>(CodingType::Automatic));
+ obs_property_list_add_int(p, "CABAC", static_cast<int32_t>(CodingType::CABAC));
+ obs_property_list_add_int(p, "CAVLC", static_cast<int32_t>(CodingType::CALVC));
+ #pragma endregion Coding Type
- #pragma region Method Parameters
+ #pragma region Maximum Reference Frames
+ p = obs_properties_add_int_slider(props, P_MAXIMUMREFERENCEFRAMES, P_TRANSLATE(P_MAXIMUMREFERENCEFRAMES),
+ 1, 16, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_MAXIMUMREFERENCEFRAMES)));
+ #pragma endregion Maximum Reference Frames
+
+ // Rate Control
+ #pragma region Rate Control Method
+ p = obs_properties_add_list(props, P_RATECONTROLMETHOD, P_TRANSLATE(P_RATECONTROLMETHOD), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_RATECONTROLMETHOD)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_RATECONTROLMETHOD_CQP), static_cast<int32_t>(RateControlMethod::ConstantQP));
+ obs_property_list_add_int(p, P_TRANSLATE(P_RATECONTROLMETHOD_CBR), static_cast<int32_t>(RateControlMethod::ConstantBitrate));
+ obs_property_list_add_int(p, P_TRANSLATE(P_RATECONTROLMETHOD_VBR), static_cast<int32_t>(RateControlMethod::PeakConstrainedVariableBitrate));
+ obs_property_list_add_int(p, P_TRANSLATE(P_RATECONTROLMETHOD_VBRLAT), static_cast<int32_t>(RateControlMethod::LatencyConstrainedVariableBitrate));
+ obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion Rate Control Method
+
+ #pragma region Pre-Pass
+ p = obs_properties_add_list(props, P_PREPASSMODE, P_TRANSLATE(P_PREPASSMODE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PREPASSMODE)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), static_cast<int32_t>(PrePassMode::Disabled));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PREPASSMODE_QUARTER), static_cast<int32_t>(PrePassMode::EnabledAtQuarterScale));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PREPASSMODE_HALF), static_cast<int32_t>(PrePassMode::EnabledAtHalfScale));
+ obs_property_list_add_int(p, P_TRANSLATE(P_PREPASSMODE_FULL), static_cast<int32_t>(PrePassMode::Enabled));
+ #pragma endregion Pre-Pass
+
+ #pragma region Parameters
/// Bitrate Constraints
- p = obs_properties_add_int(props, AMF_H264_BITRATE_TARGET, TEXT_T(AMF_H264_BITRATE_TARGET), 0,
- 1, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_BITRATE_TARGET_DESCRIPTION));
- p = obs_properties_add_int(props, AMF_H264_BITRATE_PEAK, TEXT_T(AMF_H264_BITRATE_PEAK), 0,
- 1, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_BITRATE_PEAK_DESCRIPTION));
+ p = obs_properties_add_int(props, P_BITRATE_TARGET, P_TRANSLATE(P_BITRATE_TARGET), 0, 1, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_BITRATE_TARGET)));
+ p = obs_properties_add_int(props, P_BITRATE_PEAK, P_TRANSLATE(P_BITRATE_PEAK), 0, 1, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_BITRATE_PEAK)));
/// Minimum QP, Maximum QP
- p = obs_properties_add_int_slider(props, AMF_H264_QP_MINIMUM, TEXT_T(AMF_H264_QP_MINIMUM), 0, 51, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_QP_MINIMUM_DESCRIPTION));
- p = obs_properties_add_int_slider(props, AMF_H264_QP_MAXIMUM, TEXT_T(AMF_H264_QP_MAXIMUM), 0, 51, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_QP_MAXIMUM_DESCRIPTION));
+ p = obs_properties_add_int_slider(props, P_QP_MINIMUM, P_TRANSLATE(P_QP_MINIMUM), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_MINIMUM)));
+ p = obs_properties_add_int_slider(props, P_QP_MAXIMUM, P_TRANSLATE(P_QP_MAXIMUM), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_MAXIMUM)));
/// Method: Constant QP
- p = obs_properties_add_int_slider(props, AMF_H264_QP_IFRAME, TEXT_T(AMF_H264_QP_IFRAME), 0, 51, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_QP_IFRAME_DESCRIPTION));
- p = obs_properties_add_int_slider(props, AMF_H264_QP_PFRAME, TEXT_T(AMF_H264_QP_PFRAME), 0, 51, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_QP_PFRAME_DESCRIPTION));
- p = obs_properties_add_int_slider(props, AMF_H264_QP_BFRAME, TEXT_T(AMF_H264_QP_BFRAME), 0, 51, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_QP_BFRAME_DESCRIPTION));
- #pragma endregion Method Parameters
+ p = obs_properties_add_int_slider(props, P_QP_IFRAME, P_TRANSLATE(P_QP_IFRAME), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_IFRAME)));
+ p = obs_properties_add_int_slider(props, P_QP_PFRAME, P_TRANSLATE(P_QP_PFRAME), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_PFRAME)));
+ p = obs_properties_add_int_slider(props, P_QP_BFRAME, P_TRANSLATE(P_QP_BFRAME), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_BFRAME)));
+ #pragma endregion Parameters
+
+ #pragma region Filler Data
+ p = obs_properties_add_list(props, P_FILLERDATA, P_TRANSLATE(P_FILLERDATA), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_FILLERDATA)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion Filler Data
+
+ #pragma region Frame Skipping
+ p = obs_properties_add_list(props, P_FRAMESKIPPING, P_TRANSLATE(P_FRAMESKIPPING), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_FRAMESKIPPING)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ p = obs_properties_add_int(props, P_FRAMESKIPPING_PERIOD, P_TRANSLATE(P_FRAMESKIPPING_PERIOD), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_FRAMESKIPPING_PERIOD)));
+ p = obs_properties_add_list(props, P_FRAMESKIPPING_BEHAVIOUR, P_TRANSLATE(P_FRAMESKIPPING_BEHAVIOUR), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_FRAMESKIPPING_BEHAVIOUR)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_FRAMESKIPPING_SKIPNTH), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_FRAMESKIPPING_KEEPNTH), 1);
+ #pragma endregion Frame Skipping
+
+ #pragma region VBAQ
+ p = obs_properties_add_list(props, P_VBAQ, P_TRANSLATE(P_VBAQ), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBAQ)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion VBAQ
+
+ #pragma region Enforce Hypothetical Reference Decoder Restrictions
+ p = obs_properties_add_list(props, P_ENFORCEHRD, P_TRANSLATE(P_ENFORCEHRD), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_ENFORCEHRD)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion Enforce Hypothetical Reference Decoder Restrictions
+
+ // VBV Buffer
+ #pragma region VBV Buffer Mode
+ p = obs_properties_add_list(props, P_VBVBUFFER, P_TRANSLATE(P_VBVBUFFER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBVBUFFER)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_AUTOMATIC), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_MANUAL), 1);
+ obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion VBV Buffer Mode
+
+ #pragma region VBV Buffer Strictness
+ p = obs_properties_add_float_slider(props, P_VBVBUFFER_STRICTNESS, P_TRANSLATE(P_VBVBUFFER_STRICTNESS), 0.0, 100.0, 0.1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBVBUFFER_STRICTNESS)));
+ #pragma endregion VBV Buffer Strictness
+
+ #pragma region VBV Buffer Size
+ p = obs_properties_add_int_slider(props, P_VBVBUFFER_SIZE, P_TRANSLATE(P_VBVBUFFER_SIZE), 1, 1000000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBVBUFFER_SIZE)));
+ #pragma endregion VBV Buffer Size
+
+ #pragma region VBV Buffer Initial Fullness
+ p = obs_properties_add_float_slider(props, P_VBVBUFFER_INITIALFULLNESS, P_TRANSLATE(P_VBVBUFFER_INITIALFULLNESS), 0.0, 100.0, 100.0 / 64.0);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBVBUFFER_INITIALFULLNESS)));
+ #pragma endregion VBV Buffer Initial Fullness
+
+ // Picture Control
+ #pragma region Interval and Periods
+ /// Keyframe, IDR
+ p = obs_properties_add_float(props, P_INTERVAL_KEYFRAME, P_TRANSLATE(P_INTERVAL_KEYFRAME), 0, 100, 0.001);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_INTERVAL_KEYFRAME)));
+ p = obs_properties_add_int(props, P_PERIOD_IDR_H264, P_TRANSLATE(P_PERIOD_IDR_H264), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PERIOD_IDR_H264)));
+ /// I-Frame
+ p = obs_properties_add_float(props, P_INTERVAL_IFRAME, P_TRANSLATE(P_INTERVAL_IFRAME), 0, 100, 0.001);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_INTERVAL_IFRAME)));
+ p = obs_properties_add_int(props, P_PERIOD_IFRAME, P_TRANSLATE(P_PERIOD_IFRAME), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PERIOD_IFRAME)));
+ /// P-Frame
+ p = obs_properties_add_float(props, P_INTERVAL_PFRAME, P_TRANSLATE(P_INTERVAL_PFRAME), 0, 100, 0.001);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_INTERVAL_PFRAME)));
+ p = obs_properties_add_int(props, P_PERIOD_PFRAME, P_TRANSLATE(P_PERIOD_PFRAME), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PERIOD_PFRAME)));
+ /// B-Frame
+ p = obs_properties_add_float(props, P_INTERVAL_BFRAME, P_TRANSLATE(P_INTERVAL_BFRAME), 0, 100, 0.001);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_INTERVAL_BFRAME)));
+ p = obs_properties_add_int(props, P_PERIOD_BFRAME, P_TRANSLATE(P_PERIOD_BFRAME), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PERIOD_BFRAME)));
+ #pragma endregion Interval and Periods
+
+ #pragma region B-Frames Pattern
+ p = obs_properties_add_int_slider(props, P_BFRAME_PATTERN, P_TRANSLATE(P_BFRAME_PATTERN), 0, 3, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_BFRAME_PATTERN)));
+ obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion B-Frames Pattern
- #pragma region VBV Buffer
- p = obs_properties_add_list(props, AMF_H264_VBVBUFFER, TEXT_T(AMF_H264_VBVBUFFER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_VBVBUFFER_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_AUTOMATIC), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_MANUAL), 1);
+ #pragma region B-Frames Reference
+ p = obs_properties_add_list(props, P_BFRAME_REFERENCE, P_TRANSLATE(P_BFRAME_REFERENCE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_BFRAME_REFERENCE)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
obs_property_set_modified_callback(p, properties_modified);
- p = obs_properties_add_float_slider(props, AMF_H264_VBVBUFFER_STRICTNESS, TEXT_T(AMF_H264_VBVBUFFER_STRICTNESS), 0.0, 100.0, 0.1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_VBVBUFFER_STRICTNESS_DESCRIPTION));
- p = obs_properties_add_int_slider(props, AMF_H264_VBVBUFFER_SIZE, TEXT_T(AMF_H264_VBVBUFFER_SIZE), 1, 1000000, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_VBVBUFFER_SIZE_DESCRIPTION));
- p = obs_properties_add_float_slider(props, AMF_H264_VBVBUFFER_FULLNESS, TEXT_T(AMF_H264_VBVBUFFER_FULLNESS), 0.0, 100.0, 100.0 / 64.0);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_VBVBUFFER_FULLNESS_DESCRIPTION));
- #pragma endregion VBV Buffer
+ #pragma endregion B-Frames Reference
- /// Max Access Unit Size
- p = obs_properties_add_int_slider(props, AMF_H264_MAXIMUMACCESSUNITSIZE, TEXT_T(AMF_H264_MAXIMUMACCESSUNITSIZE), 0, 100000000, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_MAXIMUMACCESSUNITSIZE_DESCRIPTION));
-
- #pragma region Flags
- /// Filler Data (Only supported by CBR so far)
- p = obs_properties_add_list(props, AMF_H264_FILLERDATA, TEXT_T(AMF_H264_FILLERDATA), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_FILLERDATA_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
-
- /// Frame Skipping
- p = obs_properties_add_list(props, AMF_H264_FRAMESKIPPING, TEXT_T(AMF_H264_FRAMESKIPPING), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_FRAMESKIPPING_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
-
- /// Enforce Hypothetical Reference Decoder Compatibility
- p = obs_properties_add_list(props, AMF_H264_ENFORCEHRDCOMPATIBILITY, TEXT_T(AMF_H264_ENFORCEHRDCOMPATIBILITY), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_ENFORCEHRDCOMPATIBILITY_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
- #pragma endregion Flags
- #pragma endregion Rate Control Properties
-
- #pragma region Frame Control Properties
- #pragma region IDR Period / Keyframe Interval
- p = obs_properties_add_float(props, AMF_H264_KEYFRAME_INTERVAL, TEXT_T(AMF_H264_KEYFRAME_INTERVAL), 0, 100, 0.001);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_KEYFRAME_INTERVAL_DESCRIPTION));
- p = obs_properties_add_int(props, AMF_H264_IDR_PERIOD, TEXT_T(AMF_H264_IDR_PERIOD), 1, 1000, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_IDR_PERIOD_DESCRIPTION));
- #pragma endregion IDR Period / Keyframe Interval
+ #pragma region B-Frames Delta QP
+ p = obs_properties_add_int_slider(props, P_BFRAME_REFERENCEDELTAQP, P_TRANSLATE(P_BFRAME_REFERENCEDELTAQP), -10, 10, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_BFRAME_REFERENCEDELTAQP)));
- #pragma region B-Frames
- /// B-Frames Pattern
- p = obs_properties_add_int_slider(props, AMF_H264_BFRAME_PATTERN, TEXT_T(AMF_H264_BFRAME_PATTERN),
- static_cast<int32_t>(H264BFramePattern::None),
- static_cast<int32_t>(H264BFramePattern::Three),
- 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_BFRAME_PATTERN_DESCRIPTION));
- obs_property_set_modified_callback(p, properties_modified);
- /// Enable Reference to B-Frames (2nd Generation GCN and newer)
- p = obs_properties_add_list(props, AMF_H264_BFRAME_REFERENCE, TEXT_T(AMF_H264_BFRAME_REFERENCE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_BFRAME_REFERENCE_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
- obs_property_set_modified_callback(p, properties_modified);
- /// B-Frame Delta QP
- p = obs_properties_add_int_slider(props, AMF_H264_BFRAME_REFERENCEDELTAQP, TEXT_T(AMF_H264_BFRAME_REFERENCEDELTAQP), -10, 10, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_BFRAME_REFERENCEDELTAQP_DESCRIPTION));
- p = obs_properties_add_int_slider(props, AMF_H264_BFRAME_DELTAQP, TEXT_T(AMF_H264_BFRAME_DELTAQP), -10, 10, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_BFRAME_DELTAQP_DESCRIPTION));
- #pragma endregion B-Frames
+ p = obs_properties_add_int_slider(props, P_BFRAME_DELTAQP, P_TRANSLATE(P_BFRAME_DELTAQP), -10, 10, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_BFRAME_DELTAQP)));
+ #pragma endregion B-Frames Delta QP
- /// De-Blocking Filter
- p = obs_properties_add_list(props, AMF_H264_DEBLOCKINGFILTER, TEXT_T(AMF_H264_DEBLOCKINGFILTER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_DEBLOCKINGFILTER_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
- #pragma endregion Frame Control Properties
-
- #pragma region Miscellaneous Control Properties
- /// Scan Type
- p = obs_properties_add_list(props, AMF_H264_SCANTYPE, TEXT_T(AMF_H264_SCANTYPE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_SCANTYPE_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_SCANTYPE_PROGRESSIVE), static_cast<int32_t>(H264ScanType::Progressive));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_SCANTYPE_INTERLACED), static_cast<int32_t>(H264ScanType::Interlaced));
-
- /// Motion Estimation
- p = obs_properties_add_list(props, AMF_H264_MOTIONESTIMATION, TEXT_T(AMF_H264_MOTIONESTIMATION), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_MOTIONESTIMATION_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_MOTIONESTIMATION_NONE), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_H264_MOTIONESTIMATION_HALF), 1);
- obs_property_list_add_int(p, TEXT_T(AMF_H264_MOTIONESTIMATION_QUARTER), 2);
- obs_property_list_add_int(p, TEXT_T(AMF_H264_MOTIONESTIMATION_BOTH), 3);
- #pragma endregion Miscellaneous Control Properties
-
- #pragma region Experimental Properties
- #pragma region Coding Type
- p = obs_properties_add_list(props, AMF_H264_CODINGTYPE, TEXT_T(AMF_H264_CODINGTYPE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_CODINGTYPE_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_DEFAULT), static_cast<int32_t>(H264CodingType::Default));
- obs_property_list_add_int(p, "CABAC", static_cast<int32_t>(H264CodingType::CABAC));
- obs_property_list_add_int(p, "CALVC", static_cast<int32_t>(H264CodingType::CALVC));
- #pragma endregion Coding Type
+ #pragma region Deblocking Filter
+ p = obs_properties_add_list(props, P_DEBLOCKINGFILTER, P_TRANSLATE(P_DEBLOCKINGFILTER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_DEBLOCKINGFILTER)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion Deblocking Filter
- #pragma region Long Term Reference Frames
- p = obs_properties_add_int_slider(props, AMF_H264_MAXIMUMLTRFRAMES, TEXT_T(AMF_H264_MAXIMUMLTRFRAMES), 0, 2, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_MAXIMUMLTRFRAMES_DESCRIPTION));
- obs_property_set_modified_callback(p, properties_modified);
- #pragma endregion Long Term Reference Frames
-
- /// Header Insertion Spacing
- p = obs_properties_add_int(props, AMF_H264_HEADER_INSERTION_SPACING, TEXT_T(AMF_H264_HEADER_INSERTION_SPACING), 0, 1000, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_HEADER_INSERTION_SPACING_DESCRIPTION));
-
- #pragma region Slicing
- /// Number of Slices Per Frame
- p = obs_properties_add_int_slider(props, AMF_H264_SLICESPERFRAME, TEXT_T(AMF_H264_SLICESPERFRAME), 1, 8160, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_SLICESPERFRAME_DESCRIPTION));
-
- /// Slice Mode
- p = obs_properties_add_list(props, AMF_H264_SLICEMODE, TEXT_T(AMF_H264_SLICEMODE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_SLICEMODE_DESCRIPTION));
- obs_property_list_add_int(p, "Horizontal", static_cast<int32_t>(H264SliceMode::Horizontal));
- obs_property_list_add_int(p, "Vertical", static_cast<int32_t>(H264SliceMode::Vertical));
-
- /// Maximum Slice Size
- p = obs_properties_add_int_slider(props, AMF_H264_MAXIMUMSLICESIZE, TEXT_T(AMF_H264_MAXIMUMSLICESIZE), 1, INT_MAX, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_MAXIMUMSLICESIZE_DESCRIPTION));
-
- /// Slice Control Mode
- p = obs_properties_add_list(props, AMF_H264_SLICECONTROLMODE, TEXT_T(AMF_H264_SLICECONTROLMODE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_SLICECONTROLMODE_DESCRIPTION));
- obs_property_list_add_int(p, Utility::SliceControlModeAsString(H264SliceControlMode::Off), static_cast<int32_t>(H264SliceControlMode::Off));
- obs_property_list_add_int(p, Utility::SliceControlModeAsString(H264SliceControlMode::Macroblock), static_cast<int32_t>(H264SliceControlMode::Macroblock));
- obs_property_list_add_int(p, Utility::SliceControlModeAsString(H264SliceControlMode::Macroblock_Row), static_cast<int32_t>(H264SliceControlMode::Macroblock_Row));
-
- /// Slice Control Size
- p = obs_properties_add_int_slider(props, AMF_H264_SLICECONTROLSIZE, TEXT_T(AMF_H264_SLICECONTROLSIZE), 0, 34560, 1); // 4096x2160 / 16x16
- obs_property_set_long_description(p, TEXT_T(AMF_H264_SLICECONTROLSIZE_DESCRIPTION));
- #pragma endregion Slicing
-
- #pragma region Intra Refresh
- /// Intra Refresh: Number of Stripes
- p = obs_properties_add_int_slider(props, AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES, TEXT_T(AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES), 0, INT_MAX, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES_DESCRIPTION));
-
- /// Intra Refresh: Macroblocks Per Slot
- p = obs_properties_add_int_slider(props, AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT, TEXT_T(AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT), 0, 34560, 1); // 4096x2160 / 16x16
- obs_property_set_long_description(p, TEXT_T(AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT_DESCRIPTION));
- #pragma endregion Intra Refresh
-
- /// Wait For Task
- p = obs_properties_add_list(props, AMF_H264_WAITFORTASK, TEXT_T(AMF_H264_WAITFORTASK), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_WAITFORTASK_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
-
- /// Preanalysis Pass
- p = obs_properties_add_list(props, AMF_H264_PREANALYSISPASS, TEXT_T(AMF_H264_PREANALYSISPASS), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_PREANALYSISPASS_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
-
- /// VBAQ
- p = obs_properties_add_list(props, AMF_H264_VBAQ, TEXT_T(AMF_H264_VBAQ), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_VBAQ_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
-
- /// GOP Size
- p = obs_properties_add_int(props, AMF_H264_GOPSIZE, TEXT_T(AMF_H264_GOPSIZE), 0, INT_MAX, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_GOPSIZE_DESCRIPTION));
-
- /// GOP Alignment
- p = obs_properties_add_list(props, AMF_H264_GOPALIGNMENT, TEXT_T(AMF_H264_GOPALIGNMENT), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_GOPALIGNMENT_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
-
- /// GOP Size
- p = obs_properties_add_int_slider(props, AMF_H264_MAXIMUMREFERENCEFRAMES, TEXT_T(AMF_H264_MAXIMUMREFERENCEFRAMES), 1, 1, 1);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_MAXIMUMREFERENCEFRAMES_DESCRIPTION));
-
- #pragma endregion Experimental Properties
-
- #pragma region System Properties
- /// Video API
- p = obs_properties_add_list(props, AMF_H264_VIDEOAPI, TEXT_T(AMF_H264_VIDEOAPI), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_STRING);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_VIDEOAPI_DESCRIPTION));
+ #pragma region Motion Estimation
+ p = obs_properties_add_list(props, P_MOTIONESTIMATION, P_TRANSLATE(P_MOTIONESTIMATION), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_MOTIONESTIMATION)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_MOTIONESTIMATION_HALF), 1);
+ obs_property_list_add_int(p, P_TRANSLATE(P_MOTIONESTIMATION_QUARTER), 2);
+ obs_property_list_add_int(p, P_TRANSLATE(P_MOTIONESTIMATION_FULL), 3);
+ #pragma endregion Motion Estimation
+
+ // System
+ #pragma region Video APIs
+ p = obs_properties_add_list(props, P_VIDEO_API, P_TRANSLATE(P_VIDEO_API), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_STRING);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VIDEO_API)));
obs_property_set_modified_callback(p, properties_modified);
fill_api_list(p);
+ #pragma endregion Video APIs
- /// Video Adapter
- p = obs_properties_add_list(props, AMF_H264_VIDEOADAPTER, TEXT_T(AMF_H264_VIDEOADAPTER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_VIDEOADAPTER_DESCRIPTION));
+ #pragma region Video Adapters
+ p = obs_properties_add_list(props, P_VIDEO_ADAPTER, P_TRANSLATE(P_VIDEO_ADAPTER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VIDEO_ADAPTER)));
obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion Video Adapters
+
+ #pragma region OpenCL
+ p = obs_properties_add_list(props, P_OPENCL_TRANSFER, P_TRANSLATE(P_OPENCL_TRANSFER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_OPENCL_TRANSFER)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+
+ p = obs_properties_add_list(props, P_OPENCL_CONVERSION, P_TRANSLATE(P_OPENCL_CONVERSION), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_OPENCL_CONVERSION)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion OpenCL
+
+ #pragma region Asynchronous Queue
+ p = obs_properties_add_list(props, P_ASYNCHRONOUSQUEUE, P_TRANSLATE(P_ASYNCHRONOUSQUEUE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_ASYNCHRONOUSQUEUE)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+
+ p = obs_properties_add_int_slider(props, P_ASYNCHRONOUSQUEUE_SIZE, P_TRANSLATE(P_ASYNCHRONOUSQUEUE_SIZE), 1, 32, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_ASYNCHRONOUSQUEUE_SIZE)));
+ #pragma endregion Asynchronous Queue
- /// OpenCL
- p = obs_properties_add_list(props, AMF_H264_OPENCL, TEXT_T(AMF_H264_OPENCL), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_OPENCL_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_DISABLED), 0);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_TOGGLE_ENABLED), 1);
-
- /// View Mode
- p = obs_properties_add_list(props, AMF_H264_VIEW, TEXT_T(AMF_H264_VIEW), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
- obs_property_set_long_description(p, TEXT_T(AMF_H264_VIEW_DESCRIPTION));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_VIEW_BASIC), static_cast<int32_t>(ViewMode::Basic));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_VIEW_ADVANCED), static_cast<int32_t>(ViewMode::Advanced));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_VIEW_EXPERT), static_cast<int32_t>(ViewMode::Expert));
- obs_property_list_add_int(p, TEXT_T(AMF_H264_VIEW_MASTER), static_cast<int32_t>(ViewMode::Master));
+ #pragma region View Mode
+ p = obs_properties_add_list(props, P_VIEW, P_TRANSLATE(P_VIEW), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VIEW)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_VIEW_BASIC), static_cast<int32_t>(ViewMode::Basic));
+ obs_property_list_add_int(p, P_TRANSLATE(P_VIEW_ADVANCED), static_cast<int32_t>(ViewMode::Advanced));
+ obs_property_list_add_int(p, P_TRANSLATE(P_VIEW_EXPERT), static_cast<int32_t>(ViewMode::Expert));
+ obs_property_list_add_int(p, P_TRANSLATE(P_VIEW_MASTER), static_cast<int32_t>(ViewMode::Master));
obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion View Mode
/// Debug
- p = obs_properties_add_bool(props, AMF_H264_DEBUG, TEXT_T(AMF_H264_DEBUG));
- obs_property_set_long_description(p, TEXT_T(AMF_H264_DEBUG_DESCRIPTION));
- #pragma endregion System Properties
+ p = obs_properties_add_bool(props, P_DEBUG, P_TRANSLATE(P_DEBUG));
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_DEBUG)));
+
+ // Disable non-dynamic properties if we have an encoder.
+ obs_properties_set_param(props, data, nullptr);
return props;
}
}
static void obs_data_transfer_settings(obs_data_t * data) {
- #pragma region Version Differences
- uint64_t version = obs_data_get_int(data, AMF_H264_VERSION);
+ #define TRANSFER_STRING(xold, xnew) obs_data_set_string(data, xnew, obs_data_get_string(data, xold))
+ #define TRANSFER_FLOAT(xold, xnew) obs_data_set_double(data, xnew, obs_data_get_double(data, xold))
+ #define TRANSFER_INT(xold, xnew) obs_data_set_int(data, xnew, obs_data_get_int(data, xold))
+ #define TRANSFER_BOOL(xold, xnew) obs_data_set_bool(data, xnew, obs_data_get_bool(data, xold))
+
+ uint64_t version = obs_data_get_int(data, P_VERSION);
switch (version) {
case 0x0001000400030005ull:
- obs_data_set_double(data, AMF_H264_VBVBUFFER_STRICTNESS, obs_data_get_double(data, AMF_H264_VBVBUFFER_STRICTNESS) + 50.0);
+ obs_data_set_double(data, "AMF.H264.VBVBuffer.Strictness", obs_data_get_double(data, "AMF.H264.VBVBuffer.Strictness") + 50.0);
+ case 0x0001000400030008ull:
+ TRANSFER_INT("AMF.H264.Preset", P_PRESET);
+
+ // Static
+ //TRANSFER_INT("AMF.H264.Usage", P_USAGE);
+ TRANSFER_INT("AMF.H264.QualityPreset", P_QUALITYPRESET);
+ TRANSFER_INT("AMF.H264.Profile", P_PROFILE);
+ TRANSFER_INT("AMF.H264.ProfileLevel", P_PROFILELEVEL);
+ TRANSFER_INT("AMF.H264.AspectRatio", P_ASPECTRATIO);
+ TRANSFER_INT("AMF.H264.CodingType", P_CODINGTYPE);
+ TRANSFER_INT("AMF.H264.MaximumReferenceFrames", P_MAXIMUMREFERENCEFRAMES);
+
+ // Rate Control
+ TRANSFER_INT("AMF.H264.RateControlMethod", P_RATECONTROLMETHOD);
+ TRANSFER_INT("AMF.H264.PreAnalysisPass", P_PREPASSMODE);
+ TRANSFER_INT("AMF.H264.Bitrate.Target", P_BITRATE_TARGET);
+ TRANSFER_INT("AMF.H264.Bitrate.Peak", P_BITRATE_PEAK);
+ TRANSFER_INT("AMF.H264.QP.IFrame", P_QP_IFRAME);
+ TRANSFER_INT("AMF.H264.QP.PFrame", P_QP_PFRAME);
+ TRANSFER_INT("AMF.H264.QP.BFrame", P_QP_BFRAME);
+ TRANSFER_INT("AMF.H264.QP.Minimum", P_QP_MINIMUM);
+ TRANSFER_INT("AMF.H264.QP.Maximum", P_QP_MAXIMUM);
+ TRANSFER_INT("AMF.H264.FillerData", P_FILLERDATA);
+ TRANSFER_INT("AMF.H264.FrameSkipping", P_FRAMESKIPPING);
+ TRANSFER_INT("AMF.H264.VBAQ", P_VBAQ);
+ TRANSFER_INT("AMF.H264.EnforceHRD", P_ENFORCEHRD);
+
+ // VBV Buffer
+ TRANSFER_INT("AMF.H264.VBVBuffer", P_VBVBUFFER);
+ TRANSFER_FLOAT("AMF.H264.VBVBuffer.Strictness", P_VBVBUFFER_STRICTNESS);
+ TRANSFER_INT("AMF.H264.VBVBuffer.Size", P_VBVBUFFER_SIZE);
+ TRANSFER_FLOAT("AMF.H264.VBVBuffer.Fullness", P_VBVBUFFER_INITIALFULLNESS);
+
+ // Picture Control
+ TRANSFER_FLOAT("AMF.H264.KeyframeInterval", P_INTERVAL_KEYFRAME);
+ TRANSFER_INT("AMF.H264.IDRPeriod", P_PERIOD_IDR_H264);
+ /// GOP Type
+ /// GOP Size
+ /// GOP Alignment
+ TRANSFER_INT("AMF.H264.BFrame.Pattern", P_BFRAME_PATTERN);
+ TRANSFER_INT("AMF.H264.BFrame.DeltaQP", P_BFRAME_DELTAQP);
+ TRANSFER_INT("AMF.H264.BFrame.Reference", P_BFRAME_REFERENCE);
+	TRANSFER_INT("AMF.H264.BFrame.ReferenceDeltaQP", P_BFRAME_REFERENCEDELTAQP);
+ TRANSFER_INT("AMF.H264.DeblockingFilter", P_DEBLOCKINGFILTER);
+ TRANSFER_INT("AMF.H264.MotionEstimation", P_MOTIONESTIMATION);
+
+ // System
+ TRANSFER_STRING("AMF.H264.VideoAPI", P_VIDEO_API);
+ TRANSFER_INT("AMF.H264.VideoAdapter", P_VIDEO_ADAPTER);
+ TRANSFER_INT("AMF.H264.OpenCL", P_OPENCL_TRANSFER);
+ TRANSFER_INT("AMF.H264.OpenCL", P_OPENCL_CONVERSION);
+ TRANSFER_INT("AMF.H264.View", P_VIEW);
+ TRANSFER_INT("AMF.H264.Debug", P_DEBUG);
+ case 0x0002000000000000ull:
+ TRANSFER_FLOAT("KeyframeInterval", P_INTERVAL_KEYFRAME);
+ TRANSFER_INT("H264.IDRPeriod", P_PERIOD_IDR_H264);
+ TRANSFER_INT("H265.IDRPeriod", P_PERIOD_IDR_H265);
case PLUGIN_VERSION_FULL:
- obs_data_set_int(data, AMF_H264_VERSION, PLUGIN_VERSION_FULL);
+ obs_data_set_int(data, P_VERSION, PLUGIN_VERSION_FULL);
break;
}
- #pragma endregion Version Differences
}
bool Plugin::Interface::H264Interface::properties_modified(obs_properties_t *props, obs_property_t *, obs_data_t *data) {
bool result = false;
obs_property_t* p;
+ // Transfer settings from older Plugin versions to newer ones.
obs_data_transfer_settings(data);
+ #pragma region Video API & Adapter
+ // Video API
+ const char
+ *videoAPI_last = obs_data_get_string(data, ("last" P_VIDEO_API)),
+ *videoAPI_cur = obs_data_get_string(data, P_VIDEO_API);
+ if (strlen(videoAPI_cur) == 0) {
+ p = obs_properties_get(props, P_VIDEO_API);
+ obs_data_set_string(data, P_VIDEO_API, obs_property_list_item_string(p, 0));
+ videoAPI_cur = obs_data_get_string(data, P_VIDEO_API);
+
+ result = true;
+ }
+ /// If a different API was selected, rebuild the device list.
+ if (strcmp(videoAPI_last, videoAPI_cur) != 0) {
+ obs_data_set_string(data, ("last" P_VIDEO_API), videoAPI_cur);
+ fill_device_list(obs_properties_get(props, P_VIDEO_ADAPTER), videoAPI_cur);
+ result = true;
+
+ // Reset Video Adapter to first in list.
+ obs_data_set_int(data, P_VIDEO_ADAPTER,
+ obs_property_list_item_int(obs_properties_get(props, P_VIDEO_ADAPTER), 0));
+ }
+
+ // Video Adapter
+ int64_t
+ videoAdapter_last = obs_data_get_int(data, ("last" P_VIDEO_ADAPTER)),
+ videoAdapter_cur = obs_data_get_int(data, P_VIDEO_ADAPTER);
+ if (videoAdapter_last != videoAdapter_cur) {
+ obs_data_set_int(data, ("last" P_VIDEO_ADAPTER), videoAdapter_cur);
+ result = true;
+
+ auto api = Plugin::API::GetAPI(obs_data_get_string(data, P_VIDEO_API));
+ union {
+ int64_t v;
+ uint32_t id[2];
+ } adapterid = { videoAdapter_cur };
+ auto adapter = api->GetAdapterById(adapterid.id[0], adapterid.id[1]);
+ try {
+ auto enc = EncoderH264(api, adapter);
+
+ #define TEMP_LIMIT_DROPDOWN(func, enm, prop) { \
+ auto tmp_p = obs_properties_get(props, prop); \
+ auto tmp_l = enc.func(); \
+ enm tmp_s = static_cast<enm>(obs_data_get_int(data, obs_property_name(tmp_p))); \
+ for (size_t idx = 0; idx < obs_property_list_item_count(tmp_p); idx++) { \
+ bool enabled = false; \
+ enm tmp_v = static_cast<enm>(obs_property_list_item_int(tmp_p, idx)); \
+ for (auto tmp_k : tmp_l) { \
+ if (tmp_k == tmp_v) { \
+ enabled = true; \
+ break; \
+ } \
+ } \
+ obs_property_list_item_disable(tmp_p, idx, !enabled); \
+ if ((enabled == false) && (tmp_s == tmp_v)) \
+ obs_data_default_single(props, data, obs_property_name(tmp_p)); \
+ } \
+ }
+ #define TEMP_LIMIT_SLIDER(func, prop) { \
+ auto tmp_p = obs_properties_get(props, prop); \
+ auto tmp_l = enc.func(); \
+ obs_property_int_set_limits(tmp_p, (int)tmp_l.first, (int)tmp_l.second, 1); \
+ }
+ #define TEMP_LIMIT_SLIDER_BITRATE(func, prop) { \
+ auto tmp_p = obs_properties_get(props, prop); \
+ auto tmp_l = enc.func(); \
+ obs_property_int_set_limits(tmp_p, (int)tmp_l.first / 1000, (int)tmp_l.second / 1000, 1); \
+ }
+
+ //TEMP_LIMIT_DROPDOWN(CapsUsage, AMD::Usage, P_USAGE);
+ TEMP_LIMIT_DROPDOWN(CapsQualityPreset, AMD::QualityPreset, P_QUALITYPRESET);
+ TEMP_LIMIT_DROPDOWN(CapsProfile, AMD::Profile, P_PROFILE);
+ TEMP_LIMIT_DROPDOWN(CapsProfileLevel, AMD::ProfileLevel, P_PROFILELEVEL);
+ {
+ auto tmp_p = obs_properties_get(props, P_PROFILELEVEL);
+ obs_property_list_item_disable(tmp_p, 0, false);
+ }
+ // Aspect Ratio - No limits, only affects players/transcoders
+ TEMP_LIMIT_DROPDOWN(CapsCodingType, AMD::CodingType, P_CODINGTYPE);
+ TEMP_LIMIT_SLIDER(CapsMaximumReferenceFrames, P_MAXIMUMREFERENCEFRAMES);
+ TEMP_LIMIT_DROPDOWN(CapsRateControlMethod, AMD::RateControlMethod, P_RATECONTROLMETHOD);
+ TEMP_LIMIT_DROPDOWN(CapsPrePassMode, AMD::PrePassMode, P_PREPASSMODE);
+ TEMP_LIMIT_SLIDER_BITRATE(CapsTargetBitrate, P_BITRATE_TARGET);
+ TEMP_LIMIT_SLIDER_BITRATE(CapsPeakBitrate, P_BITRATE_PEAK);
+ TEMP_LIMIT_SLIDER_BITRATE(CapsVBVBufferSize, P_VBVBUFFER_SIZE);
+ {
+ auto bframep = obs_properties_get(props, P_BFRAME_PATTERN);
+ auto bframecaps = enc.CapsBFramePattern();
+ obs_property_int_set_limits(bframep, 0, (int)bframecaps, 1);
+ if (obs_data_get_int(data, obs_property_name(bframep)) > bframecaps) {
+ obs_data_set_int(data, obs_property_name(bframep), bframecaps);
+ }
+ }
+ } catch (const std::exception& e) {
+			PLOG_ERROR("Exception occurred while updating capabilities: %s",
+ e.what());
+ }
+ }
+ #pragma endregion Video API & Adapter
+
#pragma region Presets
- Presets lastPreset = static_cast<Presets>(obs_data_get_int(data, "last" vstr(AMF_H264_PRESET))),
- preset = static_cast<Presets>(obs_data_get_int(data, AMF_H264_PRESET));
+ Presets lastPreset = static_cast<Presets>(obs_data_get_int(data, ("last" P_PRESET))),
+ preset = static_cast<Presets>(obs_data_get_int(data, P_PRESET));
if (lastPreset != preset) { // Reset State
obs_property_t* pn = obs_properties_first(props);
do {
const char* name = obs_property_name(pn);
// Do not reset Video Adapter or API.
- if ((strcmp(name, AMF_H264_VIDEOAPI) == 0) || (strcmp(name, AMF_H264_VIDEOADAPTER) == 0))
+ if ((strcmp(name, P_VIDEO_API) == 0) || (strcmp(name, P_VIDEO_ADAPTER) == 0))
continue;
switch (obs_property_get_type(pn)) {
case Presets::Recording:
#pragma region Recording
// Static Properties
- //obs_data_set_int(data, AMF_H264_USAGE, VCEUsage_Transcoding);
- obs_data_set_int(data, AMF_H264_PROFILE, static_cast<int32_t>(H264Profile::High));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILE), false);
- obs_data_set_int(data, AMF_H264_PROFILELEVEL, static_cast<int32_t>(H264ProfileLevel::Automatic));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILELEVEL), false);
- //obs_data_set_int(data, AMF_H264_MAXIMUMLTRFRAMES, obs_data_get_default_int(data, AMF_H264_MAXIMUMLTRFRAMES));
+ //obs_data_set_int(data, P_USAGE, static_cast<int32_t>(Usage::Transcoding));
+ //obs_property_set_enabled(obs_properties_get(props, P_USAGE), false);
+ obs_data_set_int(data, P_PROFILE, static_cast<int32_t>(Profile::High));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILE), false);
+ obs_data_set_int(data, P_PROFILELEVEL, static_cast<int32_t>(ProfileLevel::Automatic));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILELEVEL), false);
// Rate Control Properties
- obs_data_set_int(data, AMF_H264_RATECONTROLMETHOD, static_cast<int32_t>(H264RateControlMethod::VariableBitrate_LatencyConstrained));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_RATECONTROLMETHOD), false);
- if (obs_data_get_int(data, AMF_H264_BITRATE_TARGET) < 10000)
- obs_data_set_int(data, AMF_H264_BITRATE_TARGET, 10000);
- obs_property_int_set_limits(obs_properties_get(props, AMF_H264_BITRATE_TARGET), 10000, 100000, 1);
- obs_data_default_single(props, data, AMF_H264_QP_MINIMUM);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_MINIMUM), false);
- obs_data_default_single(props, data, AMF_H264_QP_MAXIMUM);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_MAXIMUM), false);
- obs_data_set_int(data, AMF_H264_BFRAME_DELTAQP, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_DELTAQP), false);
- obs_data_set_int(data, AMF_H264_BFRAME_REFERENCEDELTAQP, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_REFERENCEDELTAQP), false);
- obs_data_set_int(data, AMF_H264_FILLERDATA, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_FILLERDATA), false);
+ obs_data_set_int(data, P_RATECONTROLMETHOD, static_cast<int32_t>(RateControlMethod::LatencyConstrainedVariableBitrate));
+ obs_property_set_enabled(obs_properties_get(props, P_RATECONTROLMETHOD), false);
+ if (obs_data_get_int(data, P_BITRATE_TARGET) < 10000)
+ obs_data_set_int(data, P_BITRATE_TARGET, 10000);
+ obs_data_default_single(props, data, P_QP_MINIMUM);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_MINIMUM), false);
+ obs_data_default_single(props, data, P_QP_MAXIMUM);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_MAXIMUM), false);
+ obs_data_set_int(data, P_FILLERDATA, 0);
+ obs_property_set_enabled(obs_properties_get(props, P_FILLERDATA), false);
// Frame Control Properties
- obs_data_set_double(data, AMF_H264_KEYFRAME_INTERVAL, 1);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_KEYFRAME_INTERVAL), false);
+ if (obs_data_get_double(data, P_INTERVAL_KEYFRAME) < 2)
+ obs_data_set_double(data, P_INTERVAL_KEYFRAME, 2);
// Miscellaneous Properties
- obs_data_set_int(data, AMF_H264_SCANTYPE, static_cast<int32_t>(H264ScanType::Progressive));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_SCANTYPE), false);
- obs_data_set_int(data, AMF_H264_MOTIONESTIMATION, 3);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_MOTIONESTIMATION), false);
+ obs_data_set_int(data, P_MOTIONESTIMATION, 3);
+ obs_property_set_enabled(obs_properties_get(props, P_MOTIONESTIMATION), false);
break;
#pragma endregion Recording
case Presets::HighQuality:
#pragma region High Quality
// Static Properties
- obs_data_set_int(data, AMF_H264_PROFILE, static_cast<int32_t>(H264Profile::High));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILE), false);
- obs_data_set_int(data, AMF_H264_PROFILELEVEL, static_cast<int32_t>(H264ProfileLevel::Automatic));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILELEVEL), false);
+ //obs_data_set_int(data, P_USAGE, static_cast<int32_t>(Usage::Transcoding));
+ //obs_property_set_enabled(obs_properties_get(props, P_USAGE), false);
+ obs_data_set_int(data, P_PROFILE, static_cast<int32_t>(Profile::High));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILE), false);
+ obs_data_set_int(data, P_PROFILELEVEL, static_cast<int32_t>(ProfileLevel::Automatic));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILELEVEL), false);
// Rate Control Properties
- obs_data_set_int(data, AMF_H264_RATECONTROLMETHOD, static_cast<int32_t>(H264RateControlMethod::ConstantQP));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_RATECONTROLMETHOD), false);
- obs_data_set_int(data, AMF_H264_QP_IFRAME, 26);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_IFRAME), false);
- obs_data_set_int(data, AMF_H264_QP_PFRAME, 24);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_PFRAME), false);
- obs_data_set_int(data, AMF_H264_QP_BFRAME, 22);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_BFRAME), false);
- obs_data_set_int(data, AMF_H264_BFRAME_DELTAQP, -2);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_DELTAQP), false);
- obs_data_set_int(data, AMF_H264_BFRAME_REFERENCEDELTAQP, -2);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_REFERENCEDELTAQP), false);
+ obs_data_set_int(data, P_RATECONTROLMETHOD, static_cast<int32_t>(RateControlMethod::ConstantQP));
+ obs_property_set_enabled(obs_properties_get(props, P_RATECONTROLMETHOD), false);
+ obs_data_set_int(data, P_QP_IFRAME, 18);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_IFRAME), false);
+ obs_data_set_int(data, P_QP_PFRAME, 18);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_PFRAME), false);
+ obs_data_set_int(data, P_QP_BFRAME, 18);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_BFRAME), false);
// Frame Control Properties
- obs_data_set_double(data, AMF_H264_KEYFRAME_INTERVAL, 1);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_KEYFRAME_INTERVAL), false);
+ if (obs_data_get_double(data, P_INTERVAL_KEYFRAME) < 2)
+ obs_data_set_double(data, P_INTERVAL_KEYFRAME, 2);
// Miscellaneous Properties
- obs_data_set_int(data, AMF_H264_SCANTYPE, static_cast<int32_t>(H264ScanType::Progressive));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_SCANTYPE), false);
- obs_data_set_int(data, AMF_H264_MOTIONESTIMATION, 3);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_MOTIONESTIMATION), false);
+ obs_data_set_int(data, P_MOTIONESTIMATION, 3);
+ obs_property_set_enabled(obs_properties_get(props, P_MOTIONESTIMATION), false);
break;
#pragma endregion High Quality
case Presets::Indistinguishable:
#pragma region Indistinguishable
// Static Properties
- obs_data_set_int(data, AMF_H264_PROFILE, static_cast<int32_t>(H264Profile::High));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILE), false);
- obs_data_set_int(data, AMF_H264_PROFILELEVEL, static_cast<int32_t>(H264ProfileLevel::Automatic));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILELEVEL), false);
+ //obs_data_set_int(data, P_USAGE, static_cast<int32_t>(Usage::Transcoding));
+ //obs_property_set_enabled(obs_properties_get(props, P_USAGE), false);
+ obs_data_set_int(data, P_PROFILE, static_cast<int32_t>(Profile::High));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILE), false);
+ obs_data_set_int(data, P_PROFILELEVEL, static_cast<int32_t>(ProfileLevel::Automatic));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILELEVEL), false);
// Rate Control Properties
- obs_data_set_int(data, AMF_H264_RATECONTROLMETHOD, static_cast<int32_t>(H264RateControlMethod::ConstantQP));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_RATECONTROLMETHOD), false);
- obs_data_set_int(data, AMF_H264_QP_IFRAME, 21);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_IFRAME), false);
- obs_data_set_int(data, AMF_H264_QP_PFRAME, 19);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_PFRAME), false);
- obs_data_set_int(data, AMF_H264_QP_BFRAME, 17);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_BFRAME), false);
- obs_data_set_int(data, AMF_H264_BFRAME_DELTAQP, -2);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_DELTAQP), false);
- obs_data_set_int(data, AMF_H264_BFRAME_REFERENCEDELTAQP, -2);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_REFERENCEDELTAQP), false);
+ obs_data_set_int(data, P_RATECONTROLMETHOD, static_cast<int32_t>(RateControlMethod::ConstantQP));
+ obs_property_set_enabled(obs_properties_get(props, P_RATECONTROLMETHOD), false);
+ obs_data_set_int(data, P_QP_IFRAME, 15);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_IFRAME), false);
+ obs_data_set_int(data, P_QP_PFRAME, 15);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_PFRAME), false);
+ obs_data_set_int(data, P_QP_BFRAME, 15);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_BFRAME), false);
// Frame Control Properties
- obs_data_set_double(data, AMF_H264_KEYFRAME_INTERVAL, 1);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_KEYFRAME_INTERVAL), false);
+ if (obs_data_get_double(data, P_INTERVAL_KEYFRAME) < 2)
+ obs_data_set_double(data, P_INTERVAL_KEYFRAME, 2);
// Miscellaneous Properties
- obs_data_set_int(data, AMF_H264_SCANTYPE, static_cast<int32_t>(H264ScanType::Progressive));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_SCANTYPE), false);
- obs_data_set_int(data, AMF_H264_MOTIONESTIMATION, 3);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_MOTIONESTIMATION), false);
+ obs_data_set_int(data, P_MOTIONESTIMATION, 3);
+ obs_property_set_enabled(obs_properties_get(props, P_MOTIONESTIMATION), false);
break;
#pragma endregion Indistinguishable
case Presets::Lossless:
#pragma region Lossless
// Static Properties
- obs_data_set_int(data, AMF_H264_PROFILE, static_cast<int32_t>(H264Profile::High));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILE), false);
- obs_data_set_int(data, AMF_H264_PROFILELEVEL, static_cast<int32_t>(H264ProfileLevel::Automatic));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILELEVEL), false);
+ //obs_data_set_int(data, P_USAGE, static_cast<int32_t>(Usage::Transcoding));
+ //obs_property_set_enabled(obs_properties_get(props, P_USAGE), false);
+ obs_data_set_int(data, P_PROFILE, static_cast<int32_t>(Profile::High));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILE), false);
+ obs_data_set_int(data, P_PROFILELEVEL, static_cast<int32_t>(ProfileLevel::Automatic));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILELEVEL), false);
// Rate Control Properties
- obs_data_set_int(data, AMF_H264_RATECONTROLMETHOD, static_cast<int32_t>(H264RateControlMethod::ConstantQP));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_RATECONTROLMETHOD), false);
- obs_data_set_int(data, AMF_H264_QP_IFRAME, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_IFRAME), false);
- obs_data_set_int(data, AMF_H264_QP_PFRAME, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_PFRAME), false);
- obs_data_set_int(data, AMF_H264_QP_BFRAME, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_BFRAME), false);
- obs_data_set_int(data, AMF_H264_BFRAME_DELTAQP, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_DELTAQP), false);
- obs_data_set_int(data, AMF_H264_BFRAME_REFERENCEDELTAQP, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_REFERENCEDELTAQP), false);
+ obs_data_set_int(data, P_RATECONTROLMETHOD, static_cast<int32_t>(RateControlMethod::ConstantQP));
+ obs_property_set_enabled(obs_properties_get(props, P_RATECONTROLMETHOD), false);
+ obs_data_set_int(data, P_QP_IFRAME, 0);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_IFRAME), false);
+ obs_data_set_int(data, P_QP_PFRAME, 0);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_PFRAME), false);
// Frame Control Properties
- obs_data_set_double(data, AMF_H264_KEYFRAME_INTERVAL, 1);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_KEYFRAME_INTERVAL), false);
- obs_data_set_int(data, AMF_H264_BFRAME_PATTERN, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_PATTERN), false);
- obs_data_set_int(data, AMF_H264_BFRAME_REFERENCE, 0);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_BFRAME_REFERENCE), false);
+ if (obs_data_get_double(data, P_INTERVAL_KEYFRAME) < 2)
+ obs_data_set_double(data, P_INTERVAL_KEYFRAME, 2);
+ //obs_data_set_double(data, P_INTERVAL_KEYFRAME, 2);
+ //obs_property_set_enabled(obs_properties_get(props, P_INTERVAL_KEYFRAME), true);
+ obs_data_set_int(data, P_BFRAME_PATTERN, 0);
+ obs_property_set_enabled(obs_properties_get(props, P_BFRAME_PATTERN), false);
// Miscellaneous Properties
- obs_data_set_int(data, AMF_H264_SCANTYPE, static_cast<int32_t>(H264ScanType::Progressive));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_SCANTYPE), false);
- obs_data_set_int(data, AMF_H264_MOTIONESTIMATION, 3);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_MOTIONESTIMATION), false);
+ obs_data_set_int(data, P_MOTIONESTIMATION, 3);
+ obs_property_set_enabled(obs_properties_get(props, P_MOTIONESTIMATION), false);
break;
#pragma endregion Lossless
case Presets::Twitch:
#pragma region Twitch
// Static Properties
- obs_data_set_int(data, AMF_H264_USAGE, static_cast<int32_t>(H264Usage::Transcoding));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_USAGE), false);
- obs_data_set_int(data, AMF_H264_PROFILE, static_cast<int32_t>(H264Profile::Main));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILE), false);
- obs_data_set_int(data, AMF_H264_PROFILELEVEL, static_cast<int32_t>(H264ProfileLevel::Automatic));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILELEVEL), false);
+ //obs_data_set_int(data, P_USAGE, static_cast<int32_t>(Usage::Transcoding));
+ //obs_property_set_enabled(obs_properties_get(props, P_USAGE), false);
+ obs_data_set_int(data, P_PROFILE, static_cast<int32_t>(Profile::Main));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILE), false);
+ obs_data_set_int(data, P_PROFILELEVEL, static_cast<int32_t>(ProfileLevel::Automatic));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILELEVEL), false);
// Rate Control Properties
- obs_data_set_int(data, AMF_H264_RATECONTROLMETHOD, static_cast<int32_t>(H264RateControlMethod::ConstantBitrate));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_RATECONTROLMETHOD), false);
- if (obs_data_get_int(data, AMF_H264_BITRATE_TARGET) < 500)
- obs_data_set_int(data, AMF_H264_BITRATE_TARGET, 500);
- obs_property_int_set_limits(obs_properties_get(props, AMF_H264_BITRATE_TARGET), 500, 100000, 1);
- obs_data_default_single(props, data, AMF_H264_QP_MINIMUM);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_MINIMUM), false);
- obs_data_default_single(props, data, AMF_H264_QP_MAXIMUM);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_MAXIMUM), false);
- obs_data_set_int(data, AMF_H264_FILLERDATA, 1);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_FILLERDATA), false);
+ obs_data_set_int(data, P_RATECONTROLMETHOD, static_cast<int32_t>(RateControlMethod::ConstantBitrate));
+ obs_property_set_enabled(obs_properties_get(props, P_RATECONTROLMETHOD), false);
+ if (obs_data_get_int(data, P_BITRATE_TARGET) < 500)
+ obs_data_set_int(data, P_BITRATE_TARGET, 500);
+ obs_data_default_single(props, data, P_QP_MINIMUM);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_MINIMUM), false);
+ obs_data_default_single(props, data, P_QP_MAXIMUM);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_MAXIMUM), false);
+ obs_data_set_int(data, P_FILLERDATA, 1);
+ obs_property_set_enabled(obs_properties_get(props, P_FILLERDATA), false);
// Frame Control Properties
- obs_data_set_double(data, AMF_H264_KEYFRAME_INTERVAL, 2);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_KEYFRAME_INTERVAL), false);
+ obs_data_set_double(data, P_INTERVAL_KEYFRAME, 2);
+ obs_property_set_enabled(obs_properties_get(props, P_INTERVAL_KEYFRAME), false);
// Miscellaneous Properties
- obs_data_set_int(data, AMF_H264_SCANTYPE, static_cast<int32_t>(H264ScanType::Progressive));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_SCANTYPE), false);
- obs_data_set_int(data, AMF_H264_MOTIONESTIMATION, 3);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_MOTIONESTIMATION), false);
+ obs_data_set_int(data, P_MOTIONESTIMATION, 3);
+ obs_property_set_enabled(obs_properties_get(props, P_MOTIONESTIMATION), false);
break;
#pragma endregion Twitch
case Presets::YouTube:
#pragma region YouTube
// Static Properties
- obs_data_set_int(data, AMF_H264_USAGE, static_cast<int32_t>(H264Usage::Transcoding));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_USAGE), false);
- obs_data_set_int(data, AMF_H264_PROFILE, static_cast<int32_t>(H264Profile::Main));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILE), false);
- obs_data_set_int(data, AMF_H264_PROFILELEVEL, static_cast<int32_t>(H264ProfileLevel::Automatic));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_PROFILELEVEL), false);
+ //obs_data_set_int(data, P_USAGE, static_cast<int32_t>(Usage::Transcoding));
+ //obs_property_set_enabled(obs_properties_get(props, P_USAGE), false);
+ obs_data_set_int(data, P_PROFILE, static_cast<int32_t>(Profile::Main));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILE), false);
+ obs_data_set_int(data, P_PROFILELEVEL, static_cast<int32_t>(ProfileLevel::Automatic));
+ obs_property_set_enabled(obs_properties_get(props, P_PROFILELEVEL), false);
// Rate Control Properties
- obs_data_set_int(data, AMF_H264_RATECONTROLMETHOD, static_cast<int32_t>(H264RateControlMethod::ConstantBitrate));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_RATECONTROLMETHOD), false);
- if (obs_data_get_int(data, AMF_H264_BITRATE_TARGET) < 500)
- obs_data_set_int(data, AMF_H264_BITRATE_TARGET, 500);
- obs_property_int_set_limits(obs_properties_get(props, AMF_H264_BITRATE_TARGET), 500, 100000, 1);
- obs_data_default_single(props, data, AMF_H264_QP_MINIMUM);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_MINIMUM), false);
- obs_data_default_single(props, data, AMF_H264_QP_MAXIMUM);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_QP_MAXIMUM), false);
- obs_data_set_int(data, AMF_H264_FILLERDATA, 1);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_FILLERDATA), false);
+ obs_data_set_int(data, P_RATECONTROLMETHOD, static_cast<int32_t>(RateControlMethod::ConstantBitrate));
+ obs_property_set_enabled(obs_properties_get(props, P_RATECONTROLMETHOD), false);
+ if (obs_data_get_int(data, P_BITRATE_TARGET) < 500)
+ obs_data_set_int(data, P_BITRATE_TARGET, 500);
+ obs_data_default_single(props, data, P_QP_MINIMUM);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_MINIMUM), false);
+ obs_data_default_single(props, data, P_QP_MAXIMUM);
+ obs_property_set_enabled(obs_properties_get(props, P_QP_MAXIMUM), false);
+ obs_data_set_int(data, P_FILLERDATA, 1);
+ obs_property_set_enabled(obs_properties_get(props, P_FILLERDATA), false);
// Frame Control Properties
- obs_data_set_double(data, AMF_H264_KEYFRAME_INTERVAL, 2);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_KEYFRAME_INTERVAL), false);
+ obs_data_set_double(data, P_INTERVAL_KEYFRAME, 2);
+ obs_property_set_enabled(obs_properties_get(props, P_INTERVAL_KEYFRAME), false);
// Miscellaneous Properties
- obs_data_set_int(data, AMF_H264_SCANTYPE, static_cast<int32_t>(H264ScanType::Progressive));
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_SCANTYPE), false);
- obs_data_set_int(data, AMF_H264_MOTIONESTIMATION, 3);
- obs_property_set_enabled(obs_properties_get(props, AMF_H264_MOTIONESTIMATION), false);
+ obs_data_set_int(data, P_MOTIONESTIMATION, 3);
+ obs_property_set_enabled(obs_properties_get(props, P_MOTIONESTIMATION), false);
break;
#pragma endregion YouTube
}
#pragma endregion Presets
- #pragma region Video API
- const char *lastVideoAPI = obs_data_get_string(data, "last" vstr(AMF_H264_VIDEOAPI)),
- *curVideoAPI = obs_data_get_string(data, AMF_H264_VIDEOAPI);
- if (strcmp(curVideoAPI, "") == 0) {
- p = obs_properties_get(props, AMF_H264_VIDEOAPI);
- //fill_api_list(p);
-
- obs_data_set_string(data, AMF_H264_VIDEOAPI,
- obs_property_list_item_string(p, 0));
- curVideoAPI = obs_data_get_string(data, AMF_H264_VIDEOAPI);
- }
- if ((strcmp(lastVideoAPI, curVideoAPI) != 0)
- || (strcmp(curVideoAPI, "") == 0)) {
- obs_data_set_string(data, "last" vstr(AMF_H264_VIDEOAPI), curVideoAPI);
- fill_device_list(obs_properties_get(props, AMF_H264_VIDEOADAPTER), curVideoAPI);
-
- // Reset Video Adapter to first in list.
- obs_data_set_int(data, AMF_H264_VIDEOADAPTER,
- obs_property_list_item_int(obs_properties_get(props, AMF_H264_VIDEOADAPTER), 0));
- }
- #pragma endregion Video API
-
- #pragma region Video Adapter & Capabilities
- VCEDeviceCapabilities devCaps;
- int64_t lastAdapterId = obs_data_get_int(data, "last" vstr(AMF_H264_VIDEOADAPTER)),
- curAdapterId = obs_data_get_int(data, AMF_H264_VIDEOADAPTER);
- {
- auto api = Plugin::API::Base::GetAPIByName(obs_data_get_string(data, AMF_H264_VIDEOAPI));
- auto adapter = api->GetAdapterById(curAdapterId & UINT_MAX, (curAdapterId >> 32) & UINT_MAX);
- devCaps = Plugin::AMD::VCECapabilities::GetInstance()->GetAdapterCapabilities(api, adapter, H264EncoderType::AVC);
- }
- if (lastAdapterId != curAdapterId) {
- obs_data_set_int(data, "last" vstr(AMF_H264_VIDEOADAPTER), curAdapterId);
-
- #pragma region Profile
- p = obs_properties_get(props, AMF_H264_PROFILE);
- obs_property_list_clear(p);
- switch (devCaps.maxProfile) {
- case 100:
- obs_property_list_add_int(p, "High", (int32_t)H264Profile::High);
- obs_property_list_add_int(p, "Constrained High", (int32_t)H264Profile::ConstrainedHigh);
- case 77:
- obs_property_list_add_int(p, "Main", (int32_t)H264Profile::Main);
- case 66:
- obs_property_list_add_int(p, "Baseline", (int32_t)H264Profile::Baseline);
- obs_property_list_add_int(p, "Constrained Baseline", (int32_t)H264Profile::ConstrainedBaseline);
- break;
- }
- #pragma endregion Profile
-
- #pragma region Profile Level
- p = obs_properties_get(props, AMF_H264_PROFILELEVEL);
- obs_property_list_clear(p);
- obs_property_list_add_int(p, TEXT_T(AMF_UTIL_AUTOMATIC), (int32_t)H264ProfileLevel::Automatic);
- switch (devCaps.maxProfileLevel) {
- case 52:
- obs_property_list_add_int(p, "5.2", (int32_t)H264ProfileLevel::L52);
- case 51:
- obs_property_list_add_int(p, "5.1", (int32_t)H264ProfileLevel::L51);
- case 50:
- obs_property_list_add_int(p, "5.0", (int32_t)H264ProfileLevel::L50);
- case 42: // Some VCE 2.0 Cards.
- obs_property_list_add_int(p, "4.2", (int32_t)H264ProfileLevel::L42);
- case 41: // Some APUs and VCE 1.0 Cards.
- obs_property_list_add_int(p, "4.1", (int32_t)H264ProfileLevel::L41);
- case 40: // These should in theory be supported by all VCE 1.0 devices and APUs.
- obs_property_list_add_int(p, "4.0", (int32_t)H264ProfileLevel::L40);
- case 32:
- obs_property_list_add_int(p, "3.2", (int32_t)H264ProfileLevel::L32);
- case 31:
- obs_property_list_add_int(p, "3.1", (int32_t)H264ProfileLevel::L31);
- case 30:
- obs_property_list_add_int(p, "3.0", (int32_t)H264ProfileLevel::L30);
- case 22:
- obs_property_list_add_int(p, "2.2", (int32_t)H264ProfileLevel::L22);
- case 21:
- obs_property_list_add_int(p, "2.1", (int32_t)H264ProfileLevel::L21);
- case 20:
- obs_property_list_add_int(p, "2.0", (int32_t)H264ProfileLevel::L20);
- case 13:
- obs_property_list_add_int(p, "1.3", (int32_t)H264ProfileLevel::L13);
- case 12:
- obs_property_list_add_int(p, "1.2", (int32_t)H264ProfileLevel::L12);
- case 11:
- obs_property_list_add_int(p, "1.1", (int32_t)H264ProfileLevel::L11);
- case 10:
- default:
- obs_property_list_add_int(p, "1.0", (int32_t)H264ProfileLevel::L10);
- }
- #pragma endregion Profile Level
-
- obs_property_int_set_limits(obs_properties_get(props, AMF_H264_BITRATE_TARGET),
- 10, devCaps.maxBitrate / 1000, 1);
- obs_property_int_set_limits(obs_properties_get(props, AMF_H264_BITRATE_PEAK),
- 10, devCaps.maxBitrate / 1000, 1);
- obs_property_int_set_limits(obs_properties_get(props, AMF_H264_VBVBUFFER_SIZE),
- 1, 100000, 1);
- obs_property_float_set_limits(obs_properties_get(props, AMF_H264_KEYFRAME_INTERVAL),
- 1.0 / 144.0, 30, 1.0 / 144.0);
- obs_property_int_set_limits(obs_properties_get(props, AMF_H264_IDR_PERIOD),
- 1, 1000, 1);
-
- // Experimental
- obs_property_int_set_limits(obs_properties_get(props, AMF_H264_MAXIMUMREFERENCEFRAMES),
- devCaps.minReferenceFrames, devCaps.maxReferenceFrames, 1);
- }
- #pragma endregion Video Adapter
-
#pragma region View Mode
- ViewMode lastView = static_cast<ViewMode>(obs_data_get_int(data, "last" vstr(AMF_H264_VIEW))),
- curView = static_cast<ViewMode>(obs_data_get_int(data, AMF_H264_VIEW));
+ ViewMode lastView = static_cast<ViewMode>(obs_data_get_int(data, ("last" P_VIEW))),
+ curView = static_cast<ViewMode>(obs_data_get_int(data, P_VIEW));
if (lastView != curView) {
- obs_data_set_int(data, "last" vstr(AMF_H264_VIEW), static_cast<int32_t>(curView));
+ obs_data_set_int(data, ("last" P_VIEW), static_cast<int32_t>(curView));
result = true;
}
- bool vis_basic = curView >= ViewMode::Basic,
- vis_advanced = curView >= ViewMode::Advanced,
- vis_expert = curView >= ViewMode::Expert,
- vis_master = curView >= ViewMode::Master;
-
- #pragma region Basic
- const char* basicProps[] = {
- AMF_H264_PRESET,
- AMF_H264_QUALITY_PRESET,
- AMF_H264_PROFILE,
- AMF_H264_RATECONTROLMETHOD,
- AMF_H264_VIEW,
- AMF_H264_DEBUG,
- };
- for (auto prop : basicProps) {
- obs_property_set_visible(obs_properties_get(props, prop), vis_basic);
- if (!vis_basic)
- obs_data_default_single(props, data, prop);
- }
- #pragma endregion Basic
-
- #pragma region Advanced
- const char* advancedProps[] = {
- AMF_H264_VBVBUFFER,
- AMF_H264_FRAMESKIPPING,
- AMF_H264_ENFORCEHRDCOMPATIBILITY,
- AMF_H264_DEBLOCKINGFILTER,
- AMF_H264_VIDEOAPI,
- };
- for (auto prop : advancedProps) {
- obs_property_set_visible(obs_properties_get(props, prop), vis_advanced);
- if (!vis_advanced)
- obs_data_default_single(props, data, prop);
- }
- #pragma endregion Advanced
-
- #pragma region Expert
- const char* expertProps[] = {
- AMF_H264_PROFILELEVEL,
- AMF_H264_VBVBUFFER_FULLNESS,
- AMF_H264_MOTIONESTIMATION,
+ std::vector<std::pair<const char*, ViewMode>> viewstuff = {
+ std::make_pair(P_PRESET, ViewMode::Basic),
+ // ----------- Static Section
+ //std::make_pair(P_USAGE, ViewMode::Master),
+ std::make_pair(P_QUALITYPRESET, ViewMode::Basic),
+ std::make_pair(P_PROFILE, ViewMode::Advanced),
+ std::make_pair(P_PROFILELEVEL, ViewMode::Advanced),
+ std::make_pair(P_ASPECTRATIO, ViewMode::Master),
+ std::make_pair(P_CODINGTYPE, ViewMode::Expert),
+ std::make_pair(P_MAXIMUMREFERENCEFRAMES, ViewMode::Expert),
+ // ----------- Rate Control Section
+ std::make_pair(P_RATECONTROLMETHOD, ViewMode::Basic),
+ std::make_pair(P_PREPASSMODE, ViewMode::Basic),
+ //std::make_pair(P_BITRATE_TARGET, ViewMode::Basic),
+ //std::make_pair(P_BITRATE_PEAK, ViewMode::Basic),
+ //std::make_pair(P_QP_IFRAME, ViewMode::Basic),
+ //std::make_pair(P_QP_PFRAME, ViewMode::Basic),
+ //std::make_pair(P_QP_BFRAME, ViewMode::Basic),
+ //std::make_pair(P_QP_MINIMUM, ViewMode::Advanced),
+ //std::make_pair(P_QP_MAXIMUM, ViewMode::Advanced),
+ //std::make_pair(P_FILLERDATA, ViewMode::Basic),
+ std::make_pair(P_FRAMESKIPPING, ViewMode::Advanced),
+ std::make_pair(P_FRAMESKIPPING_PERIOD, ViewMode::Master),
+ std::make_pair(P_FRAMESKIPPING_BEHAVIOUR, ViewMode::Master),
+ std::make_pair(P_VBAQ, ViewMode::Expert),
+ std::make_pair(P_ENFORCEHRD, ViewMode::Expert),
+ // ----------- VBV Buffer
+ std::make_pair(P_VBVBUFFER, ViewMode::Advanced),
+ //std::make_pair(P_VBVBUFFER_STRICTNESS, ViewMode::Advanced),
+ //std::make_pair(P_VBVBUFFER_SIZE, ViewMode::Advanced),
+ std::make_pair(P_VBVBUFFER_INITIALFULLNESS, ViewMode::Expert),
+ // ----------- Picture Control
+ std::make_pair(P_INTERVAL_KEYFRAME, ViewMode::Basic),
+ std::make_pair(P_PERIOD_IDR_H264, ViewMode::Master),
+ std::make_pair(P_INTERVAL_IFRAME, ViewMode::Master),
+ std::make_pair(P_PERIOD_IFRAME, ViewMode::Master),
+ std::make_pair(P_INTERVAL_PFRAME, ViewMode::Master),
+ std::make_pair(P_PERIOD_PFRAME, ViewMode::Master),
+ //std::make_pair(P_INTERVAL_BFRAME, ViewMode::Master),
+ //std::make_pair(P_PERIOD_BFRAME, ViewMode::Master),
+ std::make_pair(P_BFRAME_PATTERN, ViewMode::Advanced),
+ std::make_pair(P_BFRAME_DELTAQP, ViewMode::Advanced),
+ std::make_pair(P_BFRAME_REFERENCE, ViewMode::Advanced),
+ std::make_pair(P_BFRAME_REFERENCEDELTAQP, ViewMode::Advanced),
+ std::make_pair(P_DEBLOCKINGFILTER, ViewMode::Expert),
+ std::make_pair(P_MOTIONESTIMATION, ViewMode::Expert),
+ // ----------- Intra-Refresh
+ //std::make_pair("", ViewMode::Master),
+ // ----------- System
+ std::make_pair(P_VIDEO_API, ViewMode::Advanced),
+ std::make_pair(P_VIDEO_ADAPTER, ViewMode::Advanced),
+ std::make_pair(P_OPENCL_TRANSFER, ViewMode::Advanced),
+ std::make_pair(P_OPENCL_CONVERSION, ViewMode::Advanced),
+ std::make_pair(P_ASYNCHRONOUSQUEUE, ViewMode::Expert),
+ std::make_pair(P_ASYNCHRONOUSQUEUE_SIZE, ViewMode::Expert),
+ std::make_pair(P_VIEW, ViewMode::Basic),
+ std::make_pair(P_DEBUG, ViewMode::Basic),
};
- for (auto prop : expertProps) {
- obs_property_set_visible(obs_properties_get(props, prop), vis_expert);
- if (!vis_expert)
- obs_data_default_single(props, data, prop);
- }
- #pragma endregion Expert
-
- #pragma region Master
- const char* masterProps[] = {
- AMF_H264_USAGE,
- AMF_H264_MAXIMUMACCESSUNITSIZE,
- AMF_H264_IDR_PERIOD,
- AMF_H264_HEADER_INSERTION_SPACING,
- AMF_H264_SCANTYPE,
- AMF_H264_MAXIMUMLTRFRAMES,
- AMF_H264_CODINGTYPE,
- AMF_H264_SLICESPERFRAME,
- AMF_H264_SLICEMODE,
- AMF_H264_MAXIMUMSLICESIZE,
- AMF_H264_SLICECONTROLMODE,
- AMF_H264_SLICECONTROLSIZE,
- AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES,
- AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT,
- AMF_H264_WAITFORTASK,
- AMF_H264_PREANALYSISPASS,
- AMF_H264_VBAQ,
- AMF_H264_GOPSIZE,
- AMF_H264_GOPALIGNMENT,
- AMF_H264_MAXIMUMREFERENCEFRAMES,
- };
- for (auto prop : masterProps) {
- obs_property_set_visible(obs_properties_get(props, prop), vis_master);
- if (!vis_master)
- obs_data_default_single(props, data, prop);
+ for (std::pair<const char*, ViewMode> kv : viewstuff) {
+ bool vis = curView >= kv.second;
+ auto visp = obs_properties_get(props, kv.first);
+ if (visp != nullptr) {
+ obs_property_set_visible(visp, vis);
+ if (!vis)
+ obs_data_default_single(props, data, kv.first);
+ }
}
- #pragma endregion Master
-
- #pragma region Special Logic
- uint32_t ltrFrames = static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_MAXIMUMLTRFRAMES));
- bool usingLTRFrames = ltrFrames > 0;
-
- // Key-frame Interval
- obs_property_set_visible(obs_properties_get(props, AMF_H264_KEYFRAME_INTERVAL), !vis_master);
- if (vis_master)
- obs_data_default_single(props, data, AMF_H264_KEYFRAME_INTERVAL);
+ // Special Logic
#pragma region B-Frames
+ auto bframeProperty = obs_properties_get(props, P_BFRAME_PATTERN);
+ bool bframeSupported = obs_property_int_max(bframeProperty) > 0;
+ bool bframeVisible = (curView >= ViewMode::Advanced) && bframeSupported;
+
/// Pattern
- obs_property_set_visible(obs_properties_get(props, AMF_H264_BFRAME_PATTERN), vis_advanced && !usingLTRFrames && devCaps.supportsBFrames);
- if (!vis_advanced || usingLTRFrames || !devCaps.supportsBFrames)
- obs_data_default_single(props, data, AMF_H264_BFRAME_PATTERN);
- bool lastUsingBFrames = obs_data_get_int(data, "last" vstr(AMF_H264_BFRAME_PATTERN)) != 0,
- usingBFrames = obs_data_get_int(data, AMF_H264_BFRAME_PATTERN) != 0;
+ obs_property_set_visible(bframeProperty, (curView >= ViewMode::Advanced) && bframeSupported);
+ if (!bframeVisible)
+ obs_data_default_single(props, data, P_BFRAME_PATTERN);
+ bool lastUsingBFrames = obs_data_get_int(data, ("last" P_BFRAME_PATTERN)) != 0,
+ usingBFrames = obs_data_get_int(data, P_BFRAME_PATTERN) != 0;
if (usingBFrames != lastUsingBFrames) {
- obs_data_set_int(data, "last" vstr(AMF_H264_BFRAME_PATTERN), obs_data_get_int(data, AMF_H264_BFRAME_PATTERN));
+ obs_data_set_int(data, ("last" P_BFRAME_PATTERN), obs_data_get_int(data, P_BFRAME_PATTERN));
result = true;
}
/// Reference
- obs_property_set_visible(obs_properties_get(props, AMF_H264_BFRAME_REFERENCE), vis_advanced && !usingLTRFrames && usingBFrames && devCaps.supportsBFrames);
- if (!vis_advanced || usingLTRFrames || !usingBFrames || !devCaps.supportsBFrames)
- obs_data_default_single(props, data, AMF_H264_BFRAME_REFERENCE);
- bool lastUsingBFrameReference = obs_data_get_int(data, "last" vstr(AMF_H264_BFRAME_REFERENCE)) != 0,
- usingBFrameReference = obs_data_get_int(data, AMF_H264_BFRAME_REFERENCE) == 1;
+ bool bframeReferenceVisible = (curView >= ViewMode::Advanced) && bframeSupported && usingBFrames;
+ obs_property_set_visible(obs_properties_get(props, P_BFRAME_REFERENCE), bframeReferenceVisible);
+ if (!bframeReferenceVisible)
+ obs_data_default_single(props, data, P_BFRAME_REFERENCE);
+ bool lastUsingBFrameReference = obs_data_get_int(data, ("last" P_BFRAME_REFERENCE)) != 0,
+ usingBFrameReference = obs_data_get_int(data, P_BFRAME_REFERENCE) == 1;
if (usingBFrameReference != lastUsingBFrameReference) {
- obs_data_set_int(data, "last" vstr(AMF_H264_BFRAME_REFERENCE), obs_data_get_int(data, AMF_H264_BFRAME_REFERENCE));
+ obs_data_set_int(data, ("last" P_BFRAME_REFERENCE), obs_data_get_int(data, P_BFRAME_REFERENCE));
result = true;
}
/// QP Delta
- obs_property_set_visible(obs_properties_get(props, AMF_H264_BFRAME_DELTAQP), vis_advanced && usingBFrames && devCaps.supportsBFrames);
- if (!vis_advanced || !usingBFrames || !devCaps.supportsBFrames)
- obs_data_default_single(props, data, AMF_H264_BFRAME_DELTAQP);
- obs_property_set_visible(obs_properties_get(props, AMF_H264_BFRAME_REFERENCEDELTAQP), vis_advanced && usingBFrames && usingBFrameReference && devCaps.supportsBFrames);
- if (!vis_advanced || !usingBFrames || !usingBFrameReference || !devCaps.supportsBFrames)
- obs_data_default_single(props, data, AMF_H264_BFRAME_REFERENCEDELTAQP);
+ obs_property_set_visible(obs_properties_get(props, P_BFRAME_DELTAQP), bframeVisible && usingBFrames);
+ if (!bframeVisible || !usingBFrames)
+ obs_data_default_single(props, data, P_BFRAME_DELTAQP);
+ obs_property_set_visible(obs_properties_get(props, P_BFRAME_REFERENCEDELTAQP), bframeVisible && usingBFrames && usingBFrameReference);
+ if (!bframeVisible || !usingBFrames || !usingBFrameReference)
+ obs_data_default_single(props, data, P_BFRAME_REFERENCEDELTAQP);
#pragma endregion B-Frames
#pragma region Rate Control
vis_rcm_qp_b = false,
vis_rcm_fillerdata = false;
- H264RateControlMethod lastRCM = static_cast<H264RateControlMethod>(obs_data_get_int(data, "last" vstr(AMF_H264_RATECONTROLMETHOD))),
- curRCM = static_cast<H264RateControlMethod>(obs_data_get_int(data, AMF_H264_RATECONTROLMETHOD));
+ RateControlMethod lastRCM = static_cast<RateControlMethod>(obs_data_get_int(data, ("last" P_RATECONTROLMETHOD))),
+ curRCM = static_cast<RateControlMethod>(obs_data_get_int(data, P_RATECONTROLMETHOD));
if (lastRCM != curRCM) {
- obs_data_set_int(data, "last" vstr(AMF_H264_RATECONTROLMETHOD), static_cast<int32_t>(curRCM));
+ obs_data_set_int(data, ("last" P_RATECONTROLMETHOD), static_cast<int32_t>(curRCM));
result = true;
}
switch (curRCM) {
- case H264RateControlMethod::ConstantBitrate:
+ case RateControlMethod::ConstantQP:
+ vis_rcm_qp = true;
+ vis_rcm_qp_b = bframeSupported && usingBFrames;
+ break;
+ case RateControlMethod::ConstantBitrate:
vis_rcm_bitrate_target = true;
vis_rcm_fillerdata = true;
break;
- case H264RateControlMethod::VariableBitrate_PeakConstrained:
+ case RateControlMethod::PeakConstrainedVariableBitrate:
vis_rcm_bitrate_target = true;
vis_rcm_bitrate_peak = true;
break;
- case H264RateControlMethod::VariableBitrate_LatencyConstrained:
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
vis_rcm_bitrate_target = true;
vis_rcm_bitrate_peak = true;
break;
- case H264RateControlMethod::ConstantQP:
- vis_rcm_qp = true;
- vis_rcm_qp_b = (!usingLTRFrames) && devCaps.supportsBFrames && usingBFrames;
- break;
}
/// Bitrate
- obs_property_set_visible(obs_properties_get(props, AMF_H264_BITRATE_TARGET), vis_rcm_bitrate_target);
+ obs_property_set_visible(obs_properties_get(props, P_BITRATE_TARGET), vis_rcm_bitrate_target);
if (!vis_rcm_bitrate_target)
- obs_data_default_single(props, data, AMF_H264_BITRATE_TARGET);
- obs_property_set_visible(obs_properties_get(props, AMF_H264_BITRATE_PEAK), vis_rcm_bitrate_peak);
+ obs_data_default_single(props, data, P_BITRATE_TARGET);
+ obs_property_set_visible(obs_properties_get(props, P_BITRATE_PEAK), vis_rcm_bitrate_peak);
if (!vis_rcm_bitrate_peak)
- obs_data_default_single(props, data, AMF_H264_BITRATE_PEAK);
+ obs_data_default_single(props, data, P_BITRATE_PEAK);
/// QP
- obs_property_set_visible(obs_properties_get(props, AMF_H264_QP_IFRAME), vis_rcm_qp);
- obs_property_set_visible(obs_properties_get(props, AMF_H264_QP_PFRAME), vis_rcm_qp);
+ obs_property_set_visible(obs_properties_get(props, P_QP_IFRAME), vis_rcm_qp);
+ obs_property_set_visible(obs_properties_get(props, P_QP_PFRAME), vis_rcm_qp);
if (!vis_rcm_qp) {
- obs_data_default_single(props, data, AMF_H264_QP_IFRAME);
- obs_data_default_single(props, data, AMF_H264_QP_PFRAME);
+ obs_data_default_single(props, data, P_QP_IFRAME);
+ obs_data_default_single(props, data, P_QP_PFRAME);
}
- obs_property_set_visible(obs_properties_get(props, AMF_H264_QP_BFRAME), vis_rcm_qp_b);
+ obs_property_set_visible(obs_properties_get(props, P_QP_BFRAME), vis_rcm_qp_b);
if (!vis_rcm_qp_b)
- obs_data_default_single(props, data, AMF_H264_QP_BFRAME);
+ obs_data_default_single(props, data, P_QP_BFRAME);
/// QP Min/Max
- obs_property_set_visible(obs_properties_get(props, AMF_H264_QP_MINIMUM), vis_advanced && !vis_rcm_qp);
- obs_property_set_visible(obs_properties_get(props, AMF_H264_QP_MAXIMUM), vis_advanced && !vis_rcm_qp);
- if (!vis_advanced || vis_rcm_qp) {
- obs_data_default_single(props, data, AMF_H264_QP_MINIMUM);
- obs_data_default_single(props, data, AMF_H264_QP_MAXIMUM);
+ obs_property_set_visible(obs_properties_get(props, P_QP_MINIMUM), (curView >= ViewMode::Advanced) && !vis_rcm_qp);
+ obs_property_set_visible(obs_properties_get(props, P_QP_MAXIMUM), (curView >= ViewMode::Advanced) && !vis_rcm_qp);
+ if (!(curView >= ViewMode::Advanced) || vis_rcm_qp) {
+ obs_data_default_single(props, data, P_QP_MINIMUM);
+ obs_data_default_single(props, data, P_QP_MAXIMUM);
}
/// Filler Data (CBR only at the moment)
- obs_property_set_visible(obs_properties_get(props, AMF_H264_FILLERDATA), vis_rcm_fillerdata);
+ obs_property_set_visible(obs_properties_get(props, P_FILLERDATA), vis_rcm_fillerdata);
if (!vis_rcm_fillerdata)
- obs_data_default_single(props, data, AMF_H264_FILLERDATA);
+ obs_data_default_single(props, data, P_FILLERDATA);
#pragma endregion Rate Control
#pragma region VBV Buffer
- uint32_t vbvBufferMode = static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_VBVBUFFER));
- bool vbvBufferVisible = vis_advanced;
+ uint32_t vbvBufferMode = static_cast<uint32_t>(obs_data_get_int(data, P_VBVBUFFER));
+ bool vbvBufferVisible = (curView >= ViewMode::Advanced);
- uint32_t lastVBVBufferMode = static_cast<uint32_t>(obs_data_get_int(data, "last" vstr(AMF_H264_VBVBUFFER)));
+ uint32_t lastVBVBufferMode = static_cast<uint32_t>(obs_data_get_int(data, ("last" P_VBVBUFFER)));
if (lastVBVBufferMode != vbvBufferMode) {
- obs_data_set_int(data, "last" vstr(AMF_H264_VBVBUFFER), vbvBufferMode);
+ obs_data_set_int(data, ("last" P_VBVBUFFER), vbvBufferMode);
result = true;
}
- obs_property_set_visible(obs_properties_get(props, AMF_H264_VBVBUFFER_STRICTNESS), vbvBufferVisible && (vbvBufferMode == 0));
- obs_property_set_visible(obs_properties_get(props, AMF_H264_VBVBUFFER_SIZE), vbvBufferVisible && (vbvBufferMode == 1));
+ obs_property_set_visible(obs_properties_get(props, P_VBVBUFFER_STRICTNESS), vbvBufferVisible && (vbvBufferMode == 0));
+ obs_property_set_visible(obs_properties_get(props, P_VBVBUFFER_SIZE), vbvBufferVisible && (vbvBufferMode == 1));
if (!vbvBufferVisible || vbvBufferMode == 0)
- obs_data_default_single(props, data, AMF_H264_VBVBUFFER_SIZE);
+ obs_data_default_single(props, data, P_VBVBUFFER_SIZE);
if (!vbvBufferVisible || vbvBufferMode == 1)
- obs_data_default_single(props, data, AMF_H264_VBVBUFFER_STRICTNESS);
+ obs_data_default_single(props, data, P_VBVBUFFER_STRICTNESS);
#pragma endregion VBV Buffer
- bool isnothostmode = strcmp(obs_data_get_string(data, AMF_H264_VIDEOAPI), "Host") != 0;
- /// Video Adapter
- obs_property_set_visible(obs_properties_get(props, AMF_H264_VIDEOADAPTER), vis_advanced && isnothostmode);
- if (!vis_advanced || !isnothostmode)
- obs_data_default_single(props, data, AMF_H264_VIDEOADAPTER);
- /// OpenCL
- obs_property_set_visible(obs_properties_get(props, AMF_H264_OPENCL), vis_advanced && isnothostmode);
- if (!vis_advanced || !isnothostmode)
- obs_data_default_single(props, data, AMF_H264_OPENCL);
-
- #pragma endregion Special Logic
+ #pragma region B-Frame Interval
+ bool bframeIntervalVisible = bframeSupported && (curView >= ViewMode::Master);
+ obs_property_set_visible(obs_properties_get(props, P_PERIOD_BFRAME), bframeIntervalVisible);
+ obs_property_set_visible(obs_properties_get(props, P_INTERVAL_BFRAME), bframeIntervalVisible);
+ if (!bframeIntervalVisible) {
+ obs_data_default_single(props, data, P_PERIOD_BFRAME);
+ obs_data_default_single(props, data, P_INTERVAL_BFRAME);
+ }
+ #pragma endregion B-Frame Interval
#pragma endregion View Mode
+ // Permanently disable static properties while encoding.
+ void* enc = obs_properties_get_param(props);
+ if (enc) {
+ std::vector<const char*> hiddenProperties = {
+ // Static
+ ///P_USAGE,
+ P_QUALITYPRESET,
+ P_PROFILE,
+ P_PROFILELEVEL,
+ P_CODINGTYPE,
+ P_MAXIMUMREFERENCEFRAMES,
+
+ P_BFRAME_PATTERN,
+ P_BFRAME_REFERENCE,
+
+ // Dynamic
+ /// Rate Control
+ //P_RATECONTROLMETHOD,
+ //P_PREPASSMODE,
+ //P_FRAMESKIPPING,
+ //P_FRAMESKIPPING_PERIOD,
+ //P_FRAMESKIPPING_BEHAVIOUR,
+ //P_DEBLOCKINGFILTER,
+
+ //P_VBVBUFFER,
+ //P_VBVBUFFER_STRICTNESS,
+ //P_VBVBUFFER_SIZE,
+ //P_VBVBUFFER_INITIALFULLNESS,
+ //P_VBAQ,
+
+ //// Picture Control
+ //P_INTERVAL_KEYFRAME,
+ //P_PERIOD_IDR_H264,
+ //P_INTERVAL_IFRAME,
+ //P_PERIOD_IFRAME,
+ //P_INTERVAL_PFRAME,
+ //P_PERIOD_PFRAME,
+ //P_INTERVAL_BFRAME,
+ //P_PERIOD_BFRAME,
+
+ //P_BFRAME_PATTERN,
+ //P_BFRAME_REFERENCE,
+
+ // System
+ P_VIDEO_API,
+ P_VIDEO_ADAPTER,
+ P_OPENCL_TRANSFER,
+ P_OPENCL_CONVERSION,
+ P_ASYNCHRONOUSQUEUE,
+ P_ASYNCHRONOUSQUEUE_SIZE,
+ P_DEBUG,
+ };
+ for (const char* pr : hiddenProperties) {
+ obs_property_set_enabled(obs_properties_get(props, pr), false);
+ }
+ }
+
return result;
}
bool Plugin::Interface::H264Interface::update(void *data, obs_data_t *settings) {
- try {
+ if (data)
return static_cast<Plugin::Interface::H264Interface*>(data)->update(settings);
- } catch (std::exception e) {
- AMF_LOG_ERROR("%s", e.what());
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- } catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
- }
return false;
}
void Plugin::Interface::H264Interface::get_video_info(void *data, struct video_scale_info *info) {
- try {
+ if (data)
return static_cast<Plugin::Interface::H264Interface*>(data)->get_video_info(info);
- } catch (std::exception e) {
- AMF_LOG_ERROR("%s", e.what());
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- } catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
- return;
- }
}
bool Plugin::Interface::H264Interface::get_extra_data(void *data, uint8_t** extra_data, size_t* size) {
- try {
+ if (data)
return static_cast<Plugin::Interface::H264Interface*>(data)->get_extra_data(extra_data, size);
- } catch (std::exception e) {
- AMF_LOG_ERROR("%s", e.what());
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- } catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
- }
return false;
}
// Module Code
//////////////////////////////////////////////////////////////////////////
Plugin::Interface::H264Interface::H264Interface(obs_data_t* data, obs_encoder_t* encoder) {
- AMF_LOG_DEBUG("<H264Interface::H264Interface> Initializing...");
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Initializing...");
- // OBS Settings
- uint32_t m_cfgWidth = obs_encoder_get_width(encoder);
- uint32_t m_cfgHeight = obs_encoder_get_height(encoder);
- video_t *video = obs_encoder_video(encoder);
- const struct video_output_info *voi = video_output_get_info(video);
- uint32_t m_cfgFPSnum = voi->fps_num;
- uint32_t m_cfgFPSden = voi->fps_den;
+ m_Encoder = encoder;
- obs_data_transfer_settings(data);
+ // OBS Settings
+ uint32_t obsWidth = obs_encoder_get_width(encoder);
+ uint32_t obsHeight = obs_encoder_get_height(encoder);
+ video_t *obsVideoInfo = obs_encoder_video(encoder);
+ const struct video_output_info *voi = video_output_get_info(obsVideoInfo);
+ uint32_t obsFPSnum = voi->fps_num;
+ uint32_t obsFPSden = voi->fps_den;
//////////////////////////////////////////////////////////////////////////
/// Initialize Encoder
- bool debug = obs_data_get_bool(data, AMF_H264_DEBUG);
- Plugin::AMD::AMF::GetInstance()->EnableDebugTrace(debug);
+ bool debug = obs_data_get_bool(data, P_DEBUG);
+ Plugin::AMD::AMF::Instance()->EnableDebugTrace(debug);
- H264ColorFormat surfFormat = H264ColorFormat::NV12;
+ ColorFormat colorFormat = ColorFormat::NV12;
switch (voi->format) {
case VIDEO_FORMAT_NV12:
- surfFormat = H264ColorFormat::NV12;
+ colorFormat = ColorFormat::NV12;
break;
case VIDEO_FORMAT_I420:
- surfFormat = H264ColorFormat::I420;
+ colorFormat = ColorFormat::I420;
break;
case VIDEO_FORMAT_YUY2:
- surfFormat = H264ColorFormat::YUY2;
+ colorFormat = ColorFormat::YUY2;
break;
case VIDEO_FORMAT_RGBA:
- surfFormat = H264ColorFormat::RGBA;
+ colorFormat = ColorFormat::RGBA;
break;
case VIDEO_FORMAT_BGRA:
- surfFormat = H264ColorFormat::BGRA;
+ colorFormat = ColorFormat::BGRA;
break;
case VIDEO_FORMAT_Y800:
- surfFormat = H264ColorFormat::GRAY;
+ colorFormat = ColorFormat::GRAY;
break;
}
- m_VideoEncoder = new H264Encoder(H264EncoderType::AVC, obs_data_get_string(data, AMF_H264_VIDEOAPI),
- obs_data_get_int(data, AMF_H264_VIDEOADAPTER), !!obs_data_get_int(data, AMF_H264_OPENCL), surfFormat);
+ ColorSpace colorSpace = ColorSpace::BT601;
+ switch (voi->colorspace) {
+ case VIDEO_CS_601:
+ colorSpace = ColorSpace::BT601;
+ break;
+ case VIDEO_CS_DEFAULT:
+ case VIDEO_CS_709:
+ colorSpace = ColorSpace::BT709;
+ break;
+ }
+
+ auto api = API::GetAPI(obs_data_get_string(data, P_VIDEO_API));
+ union {
+ int64_t v;
+ uint32_t id[2];
+ } adapterid = { obs_data_get_int(data, P_VIDEO_ADAPTER) };
+ auto adapter = api->GetAdapterById(adapterid.id[0], adapterid.id[1]);
+
+ m_VideoEncoder = std::make_unique<EncoderH264>(api, adapter,
+ !!obs_data_get_int(data, P_OPENCL_TRANSFER), !!obs_data_get_int(data, P_OPENCL_CONVERSION),
+ colorFormat, colorSpace, voi->range == VIDEO_RANGE_FULL,
+ !!obs_data_get_int(data, P_ASYNCHRONOUSQUEUE), (size_t)obs_data_get_int(data, P_ASYNCHRONOUSQUEUE_SIZE));
- /// Static Properties
- m_VideoEncoder->SetUsage(static_cast<H264Usage>(obs_data_get_int(data, AMF_H264_USAGE)));
- m_VideoEncoder->SetQualityPreset(static_cast<H264QualityPreset>(obs_data_get_int(data, AMF_H264_QUALITY_PRESET)));
+ /// Static Properties
+ m_VideoEncoder->SetUsage(Usage::Transcoding);
+ m_VideoEncoder->SetQualityPreset(static_cast<QualityPreset>(obs_data_get_int(data, P_QUALITYPRESET)));
/// Frame
- m_VideoEncoder->SetColorProfile(voi->colorspace == VIDEO_CS_709 ? H264ColorProfile::Rec709 : H264ColorProfile::Rec601);
- try { m_VideoEncoder->SetFullRangeColorEnabled(voi->range == VIDEO_RANGE_FULL); } catch (...) {}
- m_VideoEncoder->SetResolution(m_cfgWidth, m_cfgHeight);
- m_VideoEncoder->SetFrameRate(m_cfgFPSnum, m_cfgFPSden);
- m_VideoEncoder->SetScanType(static_cast<H264ScanType>(obs_data_get_int(data, AMF_H264_SCANTYPE)));
+ m_VideoEncoder->SetResolution(std::make_pair(obsWidth, obsHeight));
+ m_VideoEncoder->SetFrameRate(std::make_pair(obsFPSnum, obsFPSden));
/// Profile & Level
- m_VideoEncoder->SetProfile(static_cast<H264Profile>(obs_data_get_int(data, AMF_H264_PROFILE)));
- m_VideoEncoder->SetProfileLevel(static_cast<H264ProfileLevel>(obs_data_get_int(data, AMF_H264_PROFILELEVEL)));
+ m_VideoEncoder->SetProfile(static_cast<Profile>(obs_data_get_int(data, P_PROFILE)));
+ m_VideoEncoder->SetProfileLevel(static_cast<ProfileLevel>(obs_data_get_int(data, P_PROFILELEVEL)));
- #pragma region Experimental
- /// Long Term Reference
- if (static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_MAXIMUMLTRFRAMES) > 0))
- m_VideoEncoder->SetBFramePattern(H264BFramePattern::None);
- m_VideoEncoder->SetMaximumLongTermReferenceFrames(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_MAXIMUMLTRFRAMES)));
-
- #pragma endregion Experimental
+ try {
+ m_VideoEncoder->SetCodingType(static_cast<CodingType>(obs_data_get_int(data, P_CODINGTYPE)));
+ } catch (...) {
+ }
+ try {
+ m_VideoEncoder->SetMaximumReferenceFrames(obs_data_get_int(data, P_MAXIMUMREFERENCEFRAMES));
+ } catch (...) {
+ }
// OBS - Enforce Streaming Service Restrictions
#pragma region OBS - Enforce Streaming Service Restrictions
const char* p_str = obs_data_get_string(data, "profile");
if (strcmp(p_str, "") != 0) {
if (strcmp(p_str, "constrained_baseline")) {
- m_VideoEncoder->SetProfile(H264Profile::ConstrainedBaseline);
+ m_VideoEncoder->SetProfile(Profile::ConstrainedBaseline);
} else if (strcmp(p_str, "baseline")) {
- m_VideoEncoder->SetProfile(H264Profile::Baseline);
+ m_VideoEncoder->SetProfile(Profile::Baseline);
} else if (strcmp(p_str, "main")) {
- m_VideoEncoder->SetProfile(H264Profile::Main);
+ m_VideoEncoder->SetProfile(Profile::Main);
} else if (strcmp(p_str, "constrained_high")) {
- m_VideoEncoder->SetProfile(H264Profile::ConstrainedHigh);
+ m_VideoEncoder->SetProfile(Profile::ConstrainedHigh);
} else if (strcmp(p_str, "high")) {
- m_VideoEncoder->SetProfile(H264Profile::High);
+ m_VideoEncoder->SetProfile(Profile::High);
}
} else {
switch (m_VideoEncoder->GetProfile()) {
- case H264Profile::ConstrainedBaseline:
+ case Profile::ConstrainedBaseline:
obs_data_set_string(data, "profile", "constrained_baseline");
break;
- case H264Profile::Baseline:
+ case Profile::Baseline:
obs_data_set_string(data, "profile", "baseline");
break;
- case H264Profile::Main:
+ case Profile::Main:
obs_data_set_string(data, "profile", "main");
break;
- case H264Profile::ConstrainedHigh:
+ case Profile::ConstrainedHigh:
obs_data_set_string(data, "profile", "constrained_high");
break;
- case H264Profile::High:
+ case Profile::High:
obs_data_set_string(data, "profile", "high");
break;
}
const char* preset = obs_data_get_string(data, "preset");
if (strcmp(preset, "") != 0) {
if (strcmp(preset, "speed") == 0) {
- m_VideoEncoder->SetQualityPreset(H264QualityPreset::Speed);
+ m_VideoEncoder->SetQualityPreset(QualityPreset::Speed);
} else if (strcmp(preset, "balanced") == 0) {
- m_VideoEncoder->SetQualityPreset(H264QualityPreset::Balanced);
+ m_VideoEncoder->SetQualityPreset(QualityPreset::Balanced);
} else if (strcmp(preset, "quality") == 0) {
- m_VideoEncoder->SetQualityPreset(H264QualityPreset::Quality);
+ m_VideoEncoder->SetQualityPreset(QualityPreset::Quality);
}
- obs_data_set_int(data, AMF_H264_QUALITY_PRESET, (int32_t)m_VideoEncoder->GetQualityPreset());
+ obs_data_set_int(data, P_QUALITYPRESET, (int32_t)m_VideoEncoder->GetQualityPreset());
} else {
switch (m_VideoEncoder->GetQualityPreset()) {
- case H264QualityPreset::Speed:
+ case QualityPreset::Speed:
obs_data_set_string(data, "preset", "speed");
break;
- case H264QualityPreset::Balanced:
+ case QualityPreset::Balanced:
obs_data_set_string(data, "preset", "balanced");
break;
- case H264QualityPreset::Quality:
+ case QualityPreset::Quality:
obs_data_set_string(data, "preset", "quality");
break;
}
}
// Dynamic Properties (Can be changed during Encoding)
- this->update(data);
+ //this->update(data);
- AMF_LOG_DEBUG("<H264Interface::H264Interface> Complete.");
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Complete.");
}
Plugin::Interface::H264Interface::~H264Interface() {
- AMF_LOG_DEBUG("<H264Interface::~H264Interface> Finalizing...");
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Finalizing...");
if (m_VideoEncoder) {
m_VideoEncoder->Stop();
- delete m_VideoEncoder;
+ m_VideoEncoder = nullptr;
}
- AMF_LOG_DEBUG("<H264Interface::~H264Interface> Complete.");
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Complete.");
}
bool Plugin::Interface::H264Interface::update(obs_data_t* data) {
- #pragma region Device Capabilities
- auto api = Plugin::API::Base::GetAPIByName(obs_data_get_string(data, AMF_H264_VIDEOAPI));
- int64_t adapterId = obs_data_get_int(data, AMF_H264_VIDEOADAPTER);
- auto adapter = api->GetAdapterById(adapterId & UINT_MAX, (adapterId >> 32) & UINT_MAX);
- auto devCaps = Plugin::AMD::VCECapabilities::GetInstance()->GetAdapterCapabilities(api, adapter, H264EncoderType::AVC);
- #pragma endregion Device Capabilities
+ const video_t *obsVideoInfo = obs_encoder_video(m_Encoder);
+ const struct video_output_info *voi = video_output_get_info(obsVideoInfo);
+ uint32_t obsFPSnum = voi->fps_num;
+ uint32_t obsFPSden = voi->fps_den;
+
+ // Rate Control
+ m_VideoEncoder->SetRateControlMethod(static_cast<RateControlMethod>(obs_data_get_int(data, P_RATECONTROLMETHOD)));
+ m_VideoEncoder->SetPrePassMode(static_cast<PrePassMode>(obs_data_get_int(data, P_PREPASSMODE)));
+ m_VideoEncoder->SetVarianceBasedAdaptiveQuantizationEnabled(!!obs_data_get_int(data, P_VBAQ));
+ m_VideoEncoder->SetFrameSkippingEnabled(!!obs_data_get_int(data, P_FRAMESKIPPING));
+ m_VideoEncoder->SetEnforceHRDEnabled(!!obs_data_get_int(data, P_ENFORCEHRD));
+ m_VideoEncoder->SetFillerDataEnabled(!!obs_data_get_int(data, P_FILLERDATA));
+
+ m_VideoEncoder->SetQPMinimum(static_cast<uint8_t>(obs_data_get_int(data, P_QP_MINIMUM)));
+ m_VideoEncoder->SetQPMaximum(static_cast<uint8_t>(obs_data_get_int(data, P_QP_MAXIMUM)));
+ switch (m_VideoEncoder->GetRateControlMethod()) {
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ m_VideoEncoder->SetPeakBitrate(static_cast<uint32_t>(obs_data_get_int(data, P_BITRATE_PEAK) * 1000));
+ m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(obs_data_get_int(data, P_BITRATE_TARGET) * 1000));
+ break;
+ case RateControlMethod::ConstantBitrate:
+ m_VideoEncoder->SetPeakBitrate(static_cast<uint32_t>(obs_data_get_int(data, P_BITRATE_TARGET) * 1000));
+ m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(obs_data_get_int(data, P_BITRATE_TARGET) * 1000));
+ break;
+ case RateControlMethod::ConstantQP:
+ m_VideoEncoder->SetIFrameQP(static_cast<uint8_t>(obs_data_get_int(data, P_QP_IFRAME)));
+ m_VideoEncoder->SetPFrameQP(static_cast<uint8_t>(obs_data_get_int(data, P_QP_PFRAME)));
+ try {
+ m_VideoEncoder->SetBFrameQP(static_cast<uint8_t>(obs_data_get_int(data, P_QP_BFRAME)));
+ } catch (...) {
+ }
+ break;
+ }
- #pragma region Rate Control
- // Rate Control Properties
- if (m_VideoEncoder->GetUsage() != H264Usage::UltraLowLatency) {
- m_VideoEncoder->SetRateControlMethod(static_cast<H264RateControlMethod>(obs_data_get_int(data, AMF_H264_RATECONTROLMETHOD)));
- m_VideoEncoder->SetMinimumQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_MINIMUM)));
- m_VideoEncoder->SetMaximumQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_MAXIMUM)));
- switch (static_cast<H264RateControlMethod>(obs_data_get_int(data, AMF_H264_RATECONTROLMETHOD))) {
- case H264RateControlMethod::ConstantBitrate:
- m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_BITRATE_TARGET) * 1000));
- m_VideoEncoder->SetPeakBitrate(m_VideoEncoder->GetTargetBitrate());
- break;
- case H264RateControlMethod::VariableBitrate_PeakConstrained:
- case H264RateControlMethod::VariableBitrate_LatencyConstrained:
- m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_BITRATE_TARGET) * 1000));
- m_VideoEncoder->SetPeakBitrate(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_BITRATE_PEAK) * 1000));
- break;
- case H264RateControlMethod::ConstantQP:
- m_VideoEncoder->SetIFrameQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_IFRAME)));
- m_VideoEncoder->SetPFrameQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_PFRAME)));
- if (devCaps.supportsBFrames && m_VideoEncoder->GetUsage() != H264Usage::UltraLowLatency)
- try { m_VideoEncoder->SetBFrameQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_BFRAME))); } catch (...) {}
- break;
- }
- if (obs_data_get_int(data, AMF_H264_VBVBUFFER) == 0) {
- m_VideoEncoder->SetVBVBufferAutomatic(obs_data_get_double(data, AMF_H264_VBVBUFFER_STRICTNESS) / 100.0);
- } else {
- m_VideoEncoder->SetVBVBufferSize(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_VBVBUFFER_SIZE) * 1000));
- }
- m_VideoEncoder->SetInitialVBVBufferFullness(obs_data_get_double(data, AMF_H264_VBVBUFFER_FULLNESS) / 100.0);
- m_VideoEncoder->SetFillerDataEnabled(!!obs_data_get_int(data, AMF_H264_FILLERDATA));
- m_VideoEncoder->SetFrameSkippingEnabled(!!obs_data_get_int(data, AMF_H264_FRAMESKIPPING));
+ m_VideoEncoder->SetVBVBufferInitialFullness((float)obs_data_get_double(data, P_VBVBUFFER_INITIALFULLNESS) / 100.0f);
+ if (obs_data_get_int(data, P_VBVBUFFER) == 0) {
+ m_VideoEncoder->SetVBVBufferStrictness(obs_data_get_double(data, P_VBVBUFFER_STRICTNESS) / 100.0);
} else {
- m_VideoEncoder->SetMinimumQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_MINIMUM)));
- m_VideoEncoder->SetMaximumQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_MAXIMUM)));
- m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_BITRATE_TARGET) * 1000));
-
- m_VideoEncoder->SetIFrameQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_IFRAME)));
- m_VideoEncoder->SetPFrameQP(static_cast<uint8_t>(obs_data_get_int(data, AMF_H264_QP_PFRAME)));
+ m_VideoEncoder->SetVBVBufferSize(static_cast<uint32_t>(obs_data_get_int(data, P_VBVBUFFER_SIZE) * 1000));
+ }
- if (obs_data_get_int(data, AMF_H264_VBVBUFFER) == 0) {
- m_VideoEncoder->SetVBVBufferSize(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_BITRATE_TARGET) * 1000));
- } else {
- m_VideoEncoder->SetVBVBufferSize(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_VBVBUFFER_SIZE) * 1000));
+ // Picture Control
+ double_t framerate = (double_t)obsFPSnum / (double_t)obsFPSden;
+ /// Keyframe Interval/Period
+ {
+ uint32_t idrperiod = static_cast<uint32_t>(obs_data_get_int(data, P_PERIOD_IDR_H265));
+ if (idrperiod == 0) {
+ double_t keyinterv = obs_data_get_double(data, P_INTERVAL_KEYFRAME);
+ idrperiod = static_cast<uint32_t>(ceil((keyinterv * framerate)));
}
+ m_VideoEncoder->SetIDRPeriod(idrperiod);
}
- m_VideoEncoder->SetEnforceHRDRestrictionsEnabled(obs_data_get_int(data, AMF_H264_ENFORCEHRDCOMPATIBILITY) == 1);
- #pragma endregion Rate Control
-
- // Key-frame Interval
- double_t framerate = (double_t)m_VideoEncoder->GetFrameRate().first / (double_t)m_VideoEncoder->GetFrameRate().second;
- if (static_cast<ViewMode>(obs_data_get_int(data, AMF_H264_VIEW)) == ViewMode::Master)
- m_VideoEncoder->SetIDRPeriod(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_IDR_PERIOD)));
- else
- m_VideoEncoder->SetIDRPeriod(max(static_cast<uint32_t>(obs_data_get_double(data, AMF_H264_KEYFRAME_INTERVAL) * framerate), 1));
+ /// I/P/Skip Frame Interval/Period
+ {
+ uint32_t period = static_cast<uint32_t>(obs_data_get_double(data, P_INTERVAL_IFRAME) * framerate);
+ period = max(period, static_cast<uint32_t>(obs_data_get_int(data, P_PERIOD_IFRAME)));
+ m_VideoEncoder->SetIFramePeriod(period);
+ }
+ {
+ uint32_t period = static_cast<uint32_t>(obs_data_get_double(data, P_INTERVAL_PFRAME) * framerate);
+ period = max(period, static_cast<uint32_t>(obs_data_get_int(data, P_PERIOD_PFRAME)));
+ m_VideoEncoder->SetPFramePeriod(period);
+ }
+ {
+ uint32_t period = static_cast<uint32_t>(obs_data_get_double(data, P_INTERVAL_BFRAME) * framerate);
+ period = max(period, static_cast<uint32_t>(obs_data_get_int(data, P_PERIOD_BFRAME)));
+ m_VideoEncoder->SetBFramePeriod(period);
+ }
+ {
+ uint32_t period = static_cast<uint32_t>(obs_data_get_int(data, P_FRAMESKIPPING_PERIOD));
+ m_VideoEncoder->SetFrameSkippingPeriod(period);
+ m_VideoEncoder->SetFrameSkippingBehaviour(!!obs_data_get_int(data, P_FRAMESKIPPING_BEHAVIOUR));
+ }
+ m_VideoEncoder->SetDeblockingFilterEnabled(!!obs_data_get_int(data, P_DEBLOCKINGFILTER));
#pragma region B-Frames
- if (devCaps.supportsBFrames) {
+ if (m_VideoEncoder->CapsBFramePattern() > 0) {
try {
- m_VideoEncoder->SetBFramePattern(static_cast<H264BFramePattern>(obs_data_get_int(data, AMF_H264_BFRAME_PATTERN)));
- if (obs_data_get_int(data, AMF_H264_BFRAME_PATTERN) != 0)
- m_VideoEncoder->SetBFrameDeltaQP(static_cast<int8_t>(obs_data_get_int(data, AMF_H264_BFRAME_DELTAQP)));
- } catch (...) {}
-
- // B-Frame Reference can't be used with anything else but Transcoding.
- if (m_VideoEncoder->GetUsage() == H264Usage::Transcoding) {
- try {
- m_VideoEncoder->SetBFrameReferenceEnabled(!!obs_data_get_int(data, AMF_H264_BFRAME_REFERENCE));
- if (!!obs_data_get_int(data, AMF_H264_BFRAME_REFERENCE))
- m_VideoEncoder->SetBFrameReferenceDeltaQP(static_cast<int8_t>(obs_data_get_int(data, AMF_H264_BFRAME_REFERENCEDELTAQP)));
- } catch (...) {}
+ m_VideoEncoder->SetBFramePattern(static_cast<uint8_t>(obs_data_get_int(data, P_BFRAME_PATTERN)));
+ if (obs_data_get_int(data, P_BFRAME_PATTERN) != 0)
+ m_VideoEncoder->SetBFrameDeltaQP(static_cast<int8_t>(obs_data_get_int(data, P_BFRAME_DELTAQP)));
+ m_VideoEncoder->SetBFrameReferenceEnabled(!!obs_data_get_int(data, P_BFRAME_REFERENCE));
+ if (!!obs_data_get_int(data, P_BFRAME_REFERENCE))
+ m_VideoEncoder->SetBFrameReferenceDeltaQP(static_cast<int8_t>(obs_data_get_int(data, P_BFRAME_REFERENCEDELTAQP)));
+ } catch (...) {
}
}
#pragma endregion B-Frames
- if (m_VideoEncoder->GetUsage() == H264Usage::Transcoding)
- m_VideoEncoder->SetDeblockingFilterEnabled(!!obs_data_get_int(data, AMF_H264_DEBLOCKINGFILTER));
-
- #pragma region Motion Estimation
- m_VideoEncoder->SetHalfPixelMotionEstimationEnabled(!!(obs_data_get_int(data, AMF_H264_MOTIONESTIMATION) & 1));
- m_VideoEncoder->SetQuarterPixelMotionEstimationEnabled(!!(obs_data_get_int(data, AMF_H264_MOTIONESTIMATION) & 2));
- #pragma endregion Motion Estimation
-
- #pragma region Experimental
- try { m_VideoEncoder->SetCodingType(static_cast<H264CodingType>(obs_data_get_int(data, AMF_H264_CODINGTYPE))); } catch (...) {}
- try { m_VideoEncoder->SetWaitForTaskEnabled(!!obs_data_get_int(data, AMF_H264_WAITFORTASK)); } catch (...) {}
- if (m_VideoEncoder->GetUsage() == H264Usage::Transcoding || m_VideoEncoder->GetUsage() == H264Usage::Webcam) {
- try { m_VideoEncoder->SetPreAnalysisPassEnabled(!!obs_data_get_int(data, AMF_H264_PREANALYSISPASS)); } catch (...) {}
- try { m_VideoEncoder->SetVBAQEnabled(!!obs_data_get_int(data, AMF_H264_VBAQ)); } catch (...) {}
- }
-
- try { m_VideoEncoder->SetHeaderInsertionSpacing(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_HEADER_INSERTION_SPACING))); } catch (...) {}
- if (m_VideoEncoder->GetUsage() == H264Usage::Transcoding || m_VideoEncoder->GetUsage() == H264Usage::Webcam) {
- try { m_VideoEncoder->SetMaximumAccessUnitSize(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_MAXIMUMACCESSUNITSIZE))); } catch (...) {}
- }
- try { m_VideoEncoder->SetMaximumReferenceFrames(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_MAXIMUMREFERENCEFRAMES))); } catch (...) {}
-
- if (m_VideoEncoder->GetUsage() == H264Usage::Transcoding || m_VideoEncoder->GetUsage() == H264Usage::Webcam) {
- try { m_VideoEncoder->SetGOPSize(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_GOPSIZE))); } catch (...) {}
- }
- try { m_VideoEncoder->SetGOPAlignmentEnabled(!!obs_data_get_int(data, AMF_H264_GOPALIGNMENT)); } catch (...) {}
-
- try { m_VideoEncoder->SetIntraRefreshNumberOfStripes(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_INTRAREFRESH_NUMBEROFSTRIPES))); } catch (...) {}
- try { m_VideoEncoder->SetIntraRefreshMacroblocksPerSlot(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_INTRAREFRESH_MACROBLOCKSPERSLOT))); } catch (...) {}
-
- try { m_VideoEncoder->SetSlicesPerFrame(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_SLICESPERFRAME))); } catch (...) {}
- try { m_VideoEncoder->SetSliceMode(static_cast<H264SliceMode>(obs_data_get_int(data, AMF_H264_SLICEMODE))); } catch (...) {}
- try { m_VideoEncoder->SetMaximumSliceSize(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_MAXIMUMSLICESIZE))); } catch (...) {}
- try { m_VideoEncoder->SetSliceControlMode(static_cast<H264SliceControlMode>(obs_data_get_int(data, AMF_H264_SLICECONTROLMODE))); } catch (...) {}
- try { m_VideoEncoder->SetSliceControlSize(static_cast<uint32_t>(obs_data_get_int(data, AMF_H264_SLICECONTROLSIZE))); } catch (...) {}
- #pragma endregion Experimental
-
- if (m_VideoEncoder->IsStarted()) {
- // OBS - Enforce Streaming Service Stuff
- #pragma region OBS Enforce Streaming Service Settings
- {
- // Rate Control Method
- const char* t_str = obs_data_get_string(data, "rate_control");
- if (strcmp(t_str, "") != 0) {
- if (strcmp(t_str, "CBR") == 0) {
- m_VideoEncoder->SetRateControlMethod(H264RateControlMethod::ConstantBitrate);
- m_VideoEncoder->SetFillerDataEnabled(true);
- } else if (strcmp(t_str, "VBR") == 0) {
- m_VideoEncoder->SetRateControlMethod(H264RateControlMethod::VariableBitrate_PeakConstrained);
- } else if (strcmp(t_str, "VBR_LAT") == 0) {
- m_VideoEncoder->SetRateControlMethod(H264RateControlMethod::VariableBitrate_LatencyConstrained);
- } else if (strcmp(t_str, "CQP") == 0) {
- m_VideoEncoder->SetRateControlMethod(H264RateControlMethod::ConstantQP);
- }
-
- obs_data_set_int(data, AMF_H264_RATECONTROLMETHOD, (int32_t)m_VideoEncoder->GetRateControlMethod());
- } else {
- if (m_VideoEncoder->GetUsage() != H264Usage::UltraLowLatency)
- switch (m_VideoEncoder->GetRateControlMethod()) {
- case H264RateControlMethod::ConstantBitrate:
- obs_data_set_string(data, "rate_control", "CBR");
- break;
- case H264RateControlMethod::VariableBitrate_PeakConstrained:
- obs_data_set_string(data, "rate_control", "VBR");
- break;
- case H264RateControlMethod::VariableBitrate_LatencyConstrained:
- obs_data_set_string(data, "rate_control", "VBR_LAT");
- break;
- case H264RateControlMethod::ConstantQP:
- obs_data_set_string(data, "rate_control", "CQP");
- break;
- }
+ // Motion Estimation
+ m_VideoEncoder->SetMotionEstimationHalfPixelEnabled(!!(obs_data_get_int(data, P_MOTIONESTIMATION) & 1));
+ m_VideoEncoder->SetMotionEstimationQuarterPixelEnabled(!!(obs_data_get_int(data, P_MOTIONESTIMATION) & 2));
+
+ //#pragma region Experimental
+ //try { m_VideoEncoder->SetWaitForTaskEnabled(!!obs_data_get_int(data, P_WAITFORTASK)); } catch (...) {}
+ //if (m_VideoEncoder->GetUsage() == H264Usage::Transcoding || m_VideoEncoder->GetUsage() == H264Usage::Webcam) {
+ // try { m_VideoEncoder->SetGOPSize(static_cast<uint32_t>(obs_data_get_int(data, P_GOPSIZE))); } catch (...) {}
+ //}
+ //try { m_VideoEncoder->SetGOPAlignmentEnabled(!!obs_data_get_int(data, P_GOPALIGNMENT)); } catch (...) {}
+ //try { m_VideoEncoder->SetIntraRefreshNumberOfStripes(static_cast<uint32_t>(obs_data_get_int(data, P_INTRAREFRESH_NUMBEROFSTRIPES))); } catch (...) {}
+ //try { m_VideoEncoder->SetIntraRefreshMacroblocksPerSlot(static_cast<uint32_t>(obs_data_get_int(data, P_INTRAREFRESH_MACROBLOCKSPERSLOT))); } catch (...) {}
+ //try { m_VideoEncoder->SetSlicesPerFrame(static_cast<uint32_t>(obs_data_get_int(data, P_SLICESPERFRAME))); } catch (...) {}
+ //try { m_VideoEncoder->SetSliceMode(static_cast<H264SliceMode>(obs_data_get_int(data, P_SLICEMODE))); } catch (...) {}
+ //try { m_VideoEncoder->SetMaximumSliceSize(static_cast<uint32_t>(obs_data_get_int(data, P_MAXIMUMSLICESIZE))); } catch (...) {}
+ //try { m_VideoEncoder->SetSliceControlMode(static_cast<H264SliceControlMode>(obs_data_get_int(data, P_SLICECONTROLMODE))); } catch (...) {}
+ //try { m_VideoEncoder->SetSliceControlSize(static_cast<uint32_t>(obs_data_get_int(data, P_SLICECONTROLSIZE))); } catch (...) {}
+ //#pragma endregion Experimental
+
+ #pragma region OBS Enforce Streaming Service Settings
+ {
+ // Rate Control Method
+ const char* t_str = obs_data_get_string(data, "rate_control");
+ if (strcmp(t_str, "") != 0) {
+ if (strcmp(t_str, "CBR") == 0) {
+ m_VideoEncoder->SetRateControlMethod(RateControlMethod::ConstantBitrate);
+ m_VideoEncoder->SetFillerDataEnabled(true);
+ } else if (strcmp(t_str, "VBR") == 0) {
+ m_VideoEncoder->SetRateControlMethod(RateControlMethod::PeakConstrainedVariableBitrate);
+ } else if (strcmp(t_str, "VBR_LAT") == 0) {
+ m_VideoEncoder->SetRateControlMethod(RateControlMethod::LatencyConstrainedVariableBitrate);
+ } else if (strcmp(t_str, "CQP") == 0) {
+ m_VideoEncoder->SetRateControlMethod(RateControlMethod::ConstantQP);
}
- // Bitrate
- uint64_t bitrateOvr = obs_data_get_int(data, "bitrate") * 1000;
- if (bitrateOvr != -1) {
- if (m_VideoEncoder->GetTargetBitrate() > bitrateOvr)
- m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(bitrateOvr));
+ obs_data_set_int(data, P_RATECONTROLMETHOD, (int32_t)m_VideoEncoder->GetRateControlMethod());
+ } else {
+ if (m_VideoEncoder->GetUsage() != Usage::UltraLowLatency)
+ switch (m_VideoEncoder->GetRateControlMethod()) {
+ case RateControlMethod::ConstantBitrate:
+ obs_data_set_string(data, "rate_control", "CBR");
+ break;
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ obs_data_set_string(data, "rate_control", "VBR");
+ break;
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ obs_data_set_string(data, "rate_control", "VBR_LAT");
+ break;
+ case RateControlMethod::ConstantQP:
+ obs_data_set_string(data, "rate_control", "CQP");
+ break;
+ }
+ }
- if (m_VideoEncoder->GetUsage() != H264Usage::UltraLowLatency)
- if (m_VideoEncoder->GetPeakBitrate() > bitrateOvr)
- m_VideoEncoder->SetPeakBitrate(static_cast<uint32_t>(bitrateOvr));
+ // Bitrate
+ uint64_t bitrateOvr = obs_data_get_int(data, "bitrate") * 1000;
+ if (bitrateOvr != -1) {
+ if (m_VideoEncoder->GetTargetBitrate() > bitrateOvr)
+ m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(bitrateOvr));
+ if (m_VideoEncoder->GetPeakBitrate() > bitrateOvr)
+ m_VideoEncoder->SetPeakBitrate(static_cast<uint32_t>(bitrateOvr));
- obs_data_set_int(data, "bitrate", m_VideoEncoder->GetTargetBitrate() / 1000);
+ obs_data_set_int(data, "bitrate", m_VideoEncoder->GetTargetBitrate() / 1000);
- obs_data_set_int(data, AMF_H264_BITRATE_TARGET, m_VideoEncoder->GetTargetBitrate() / 1000);
- if (m_VideoEncoder->GetUsage() != H264Usage::UltraLowLatency)
- obs_data_set_int(data, AMF_H264_BITRATE_PEAK, m_VideoEncoder->GetPeakBitrate() / 1000);
- } else {
- obs_data_set_int(data, "bitrate", m_VideoEncoder->GetTargetBitrate() / 1000);
- }
+ obs_data_set_int(data, P_BITRATE_TARGET, m_VideoEncoder->GetTargetBitrate() / 1000);
+ obs_data_set_int(data, P_BITRATE_PEAK, m_VideoEncoder->GetPeakBitrate() / 1000);
+ } else {
+ obs_data_set_int(data, "bitrate", m_VideoEncoder->GetTargetBitrate() / 1000);
+ }
- // IDR-Period (Keyframes)
- uint32_t fpsNum = m_VideoEncoder->GetFrameRate().first;
- uint32_t fpsDen = m_VideoEncoder->GetFrameRate().second;
- if (obs_data_get_int(data, "keyint_sec") != -1) {
- m_VideoEncoder->SetIDRPeriod(static_cast<uint32_t>(obs_data_get_int(data, "keyint_sec") * (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
+ // IDR-Period (Keyframes)
+ uint32_t fpsNum = m_VideoEncoder->GetFrameRate().first;
+ uint32_t fpsDen = m_VideoEncoder->GetFrameRate().second;
+ if (obs_data_get_int(data, "keyint_sec") != -1) {
+ m_VideoEncoder->SetIDRPeriod(static_cast<uint32_t>(obs_data_get_int(data, "keyint_sec") * (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
- obs_data_set_double(data, AMF_H264_KEYFRAME_INTERVAL, static_cast<double_t>(obs_data_get_int(data, "keyint_sec")));
- obs_data_set_int(data, AMF_H264_IDR_PERIOD, static_cast<uint32_t>(obs_data_get_int(data, "keyint_sec") * (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
- } else {
- obs_data_set_int(data, "keyint_sec", static_cast<uint64_t>(m_VideoEncoder->GetIDRPeriod() / (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
- }
+ obs_data_set_double(data, P_INTERVAL_KEYFRAME, static_cast<double_t>(obs_data_get_int(data, "keyint_sec")));
+ obs_data_set_int(data, P_PERIOD_IDR_H264, static_cast<uint32_t>(obs_data_get_int(data, "keyint_sec") * (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
+ } else {
+ obs_data_set_int(data, "keyint_sec", static_cast<uint64_t>(m_VideoEncoder->GetIDRPeriod() / (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
}
- #pragma endregion OBS Enforce Streaming Service Settings
+ }
+ #pragma endregion OBS Enforce Streaming Service Settings
- // Verify
+ if (m_VideoEncoder->IsStarted()) {
m_VideoEncoder->LogProperties();
- if (static_cast<ViewMode>(obs_data_get_int(data, AMF_H264_VIEW)) >= ViewMode::Master)
- AMF_LOG_ERROR("View Mode 'Master' is active, avoid giving anything but basic support. Error is most likely caused by user settings themselves.");
+ if (static_cast<ViewMode>(obs_data_get_int(data, P_VIEW)) >= ViewMode::Master)
+ PLOG_ERROR("View Mode 'Master' is active, avoid giving anything but basic support. Error is most likely caused by user settings themselves.");
}
return true;
if (!frame || !packet || !received_packet)
return false;
- bool retVal = true;
+ bool retVal = false;
- retVal = m_VideoEncoder->SendInput(frame);
- retVal = retVal && m_VideoEncoder->GetOutput(packet, received_packet);
+ try {
+ retVal = m_VideoEncoder->Encode(frame, packet, received_packet);
+ } catch (std::exception e) {
+ PLOG_ERROR("Exception during encoding: %s", e.what());
+ } catch (...) {
+ PLOG_ERROR("Unknown exception during encoding.");
+ }
return retVal;
}
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/enc-h265.cpp
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+
+//////////////////////////////////////////////////////////////////////////
+// New UI Design
+//////////////////////////////////////////////////////////////////////////
+// All: Preset
+// ----------- Static Section
+// Mas: Usage
+// All: Quality Preset
+// Adv: Profile
+// Adv: Profile Level
+// Adv: Tier
+// Mas: Aspect Ratio
+// Exp: Coding Type
+// Exp: Maximum Reference Frames
+// ----------- Rate Control Section
+// All: Rate Control Method
+// Adv: Pre-Pass Encoding (if supported)
+// All, CBR&VBR: Target Bitrate
+// All, VBR: Peak Bitrate
+// All, CQP: QP I/P
+// Adv, CBR&VBR: Min/Max I/P-Frame QP
+// CBR: Filler Data
+// Adv: Frame Skipping
+// Exp: VBAQ
+// Exp: Enforce HRD
+// ----------- VBV Buffer
+// Adv: VBV Buffer Size
+// Exp: VBV Buffer Initial Fullness
+// ----------- Picture Control
+// All: Keyframe Interval (Float, uses GOP Size Fixed/Min/Max)
+// Mas: IDR Period (in GOPs)
+// Exp: GOP Type
+// Exp: GOP Size
+// Exp: GOP Size Min/Max
+// Exp: Deblocking Filter
+// Exp: Motion Estimation (Dropdown)
+// ----------- Intra-Refresh
+// ToDo: Master Mode only?
+// ----------- System
+// Adv: API
+// Adv: Adapter
+// Exp: OpenCL
+// All: View
+
+#include "enc-h265.h"
+#include "amf-capabilities.h"
+#include "amf-encoder.h"
+#include "amf-encoder-h265.h"
+#include "strings.h"
+#include "utility.h"
+
+#define PREFIX "[H265/HEVC]"
+
+using namespace Plugin::AMD;
+
+void Plugin::Interface::H265Interface::encoder_register() {
+ // Test if we actually have HEVC support.
+ if (!AMD::CapabilityManager::Instance()->IsCodecSupported(Codec::HEVC)) {
+ PLOG_WARNING(PREFIX " Not supported by any GPU, disabling...");
+ return;
+ }
+
+ // Create structure
+ static std::unique_ptr<obs_encoder_info> encoder_info = std::make_unique<obs_encoder_info>();
+ std::memset(encoder_info.get(), 0, sizeof(obs_encoder_info));
+
+ // Initialize Structure
+ encoder_info->type = obs_encoder_type::OBS_ENCODER_VIDEO;
+ static const char* encoder_name = "amd_amf_h265";
+ encoder_info->id = encoder_name;
+ static const char* encoder_codec = "hevc";
+ encoder_info->codec = encoder_codec;
+
+ // Functions
+ encoder_info->get_name = &get_name;
+ encoder_info->get_defaults = &get_defaults;
+ encoder_info->get_properties = &get_properties;
+ encoder_info->create = &create;
+ encoder_info->destroy = &destroy;
+ encoder_info->encode = &encode;
+ encoder_info->update = &update;
+ encoder_info->get_video_info = &get_video_info;
+ encoder_info->get_extra_data = &get_extra_data;
+
+ obs_register_encoder(encoder_info.get());
+ PLOG_DEBUG(PREFIX " Registered.");
+}
+
+const char* Plugin::Interface::H265Interface::get_name(void*) {
+ static const char* name = "H265/HEVC Encoder (" PLUGIN_NAME ")";
+ return name;
+}
+
+void Plugin::Interface::H265Interface::get_defaults(obs_data_t *data) {
+ #pragma region OBS - Enforce Streaming Service Restrictions
+ obs_data_set_default_int(data, "bitrate", -1);
+ obs_data_set_default_int(data, "keyint_sec", -1);
+ obs_data_set_default_string(data, "rate_control", "");
+ obs_data_set_default_string(data, "profile", "");
+ obs_data_set_default_string(data, "preset", "");
+ obs_data_set_int(data, "bitrate", -1);
+ obs_data_set_int(data, "keyint_sec", -1);
+ obs_data_set_string(data, "rate_control", "");
+ obs_data_set_string(data, "profile", "");
+ obs_data_set_string(data, "preset", "");
+ #pragma endregion OBS - Enforce Streaming Service Restrictions
+
+ // Static
+ //obs_data_set_default_int(data, P_USAGE, static_cast<int64_t>(Usage::Transcoding));
+ obs_data_set_default_int(data, P_QUALITYPRESET, static_cast<int64_t>(QualityPreset::Balanced));
+ obs_data_set_default_int(data, P_PROFILE, static_cast<int64_t>(Profile::Main));
+ obs_data_set_default_int(data, P_PROFILELEVEL, static_cast<int64_t>(ProfileLevel::Automatic));
+ obs_data_set_default_int(data, P_TIER, static_cast<int64_t>(H265::Tier::Main));
+ //obs_data_set_default_frames_per_second(data, P_ASPECTRATIO, media_frames_per_second{ 1, 1 }, "");
+ obs_data_set_default_int(data, P_CODINGTYPE, static_cast<int64_t>(CodingType::Automatic));
+ obs_data_set_default_int(data, P_MAXIMUMREFERENCEFRAMES, 1);
+
+ // Rate Control
+ obs_data_set_int(data, ("last" P_RATECONTROLMETHOD), -1);
+ obs_data_set_default_int(data, ("last" P_RATECONTROLMETHOD), -1);
+ obs_data_set_default_int(data, P_RATECONTROLMETHOD, static_cast<int64_t>(RateControlMethod::ConstantBitrate));
+ obs_data_set_default_int(data, P_PREPASSMODE, static_cast<int64_t>(PrePassMode::Disabled));
+ obs_data_set_default_int(data, P_BITRATE_TARGET, 3500);
+ obs_data_set_default_int(data, P_BITRATE_PEAK, 9000);
+ obs_data_set_default_int(data, P_QP_IFRAME, 22);
+ obs_data_set_default_int(data, P_QP_PFRAME, 22);
+ obs_data_set_default_int(data, P_QP_BFRAME, 22);
+ obs_data_set_default_int(data, P_QP_IFRAME_MINIMUM, 18);
+ obs_data_set_default_int(data, P_QP_IFRAME_MAXIMUM, 51);
+ obs_data_set_default_int(data, P_QP_PFRAME_MINIMUM, 18);
+ obs_data_set_default_int(data, P_QP_PFRAME_MAXIMUM, 51);
+ obs_data_set_default_int(data, P_FILLERDATA, 1);
+ obs_data_set_default_int(data, P_FRAMESKIPPING, 0);
+ obs_data_set_default_int(data, P_VBAQ, 1);
+ obs_data_set_default_int(data, P_ENFORCEHRD, 0);
+
+ // VBV Buffer
+ obs_data_set_int(data, ("last" P_VBVBUFFER), -1);
+ obs_data_set_default_int(data, ("last" P_VBVBUFFER), -1);
+ obs_data_set_default_int(data, P_VBVBUFFER, 0);
+ obs_data_set_default_int(data, P_VBVBUFFER_SIZE, 3500);
+ obs_data_set_default_double(data, P_VBVBUFFER_STRICTNESS, 50);
+ obs_data_set_default_double(data, P_VBVBUFFER_INITIALFULLNESS, 100);
+
+ // Picture Control
+ obs_data_set_default_double(data, P_INTERVAL_KEYFRAME, 2.0);
+ obs_data_set_default_int(data, P_PERIOD_IDR_H265, 0);
+ obs_data_set_default_double(data, P_INTERVAL_IFRAME, 0.0);
+ obs_data_set_default_int(data, P_PERIOD_IFRAME, 0);
+ obs_data_set_default_double(data, P_INTERVAL_PFRAME, 0.0);
+ obs_data_set_default_int(data, P_PERIOD_PFRAME, 0);
+ obs_data_set_default_int(data, P_FRAMESKIPPING_PERIOD, 0);
+ obs_data_set_default_int(data, P_FRAMESKIPPING_BEHAVIOUR, 0);
+ obs_data_set_default_int(data, P_GOP_TYPE, static_cast<int64_t>(H265::GOPType::Fixed));
+ obs_data_set_default_int(data, P_GOP_SIZE, 60);
+ obs_data_set_default_int(data, P_GOP_SIZE_MINIMUM, 1);
+ obs_data_set_default_int(data, P_GOP_SIZE_MAXIMUM, 16);
+ obs_data_set_default_int(data, P_DEBLOCKINGFILTER, 1);
+ obs_data_set_default_int(data, P_MOTIONESTIMATION, 3);
+
+ // System Properties
+ obs_data_set_string(data, ("last" P_VIDEO_API), "");
+ obs_data_set_default_string(data, ("last" P_VIDEO_API), "");
+ obs_data_set_default_string(data, P_VIDEO_API, "");
+ obs_data_set_int(data, ("last" P_VIDEO_ADAPTER), 0);
+ obs_data_set_default_int(data, ("last" P_VIDEO_ADAPTER), 0);
+ obs_data_set_default_int(data, P_VIDEO_ADAPTER, 0);
+ obs_data_set_default_int(data, P_OPENCL_TRANSFER, 0);
+ obs_data_set_default_int(data, P_OPENCL_CONVERSION, 0);
+ obs_data_set_default_int(data, P_ASYNCHRONOUSQUEUE, 0);
+ obs_data_set_default_int(data, P_ASYNCHRONOUSQUEUE_SIZE, 4);
+ obs_data_set_int(data, ("last" P_VIEW), -1);
+ obs_data_set_default_int(data, ("last" P_VIEW), -1);
+ obs_data_set_default_int(data, P_VIEW, static_cast<int64_t>(ViewMode::Basic));
+ obs_data_set_default_bool(data, P_DEBUG, false);
+ obs_data_set_default_int(data, P_VERSION, PLUGIN_VERSION_FULL);
+}
+
+static void fill_api_list(obs_property_t* p) {
+ obs_property_list_clear(p);
+ auto cm = CapabilityManager::Instance();
+
+ for (auto api : Plugin::API::EnumerateAPIs()) {
+ if (cm->IsCodecSupportedByAPI(Codec::HEVC, api->GetType()))
+ obs_property_list_add_string(p, api->GetName().c_str(), api->GetName().c_str());
+ }
+}
+
+static void fill_device_list(obs_property_t* p, const char* apiname) {
+ obs_property_list_clear(p);
+
+ auto cm = CapabilityManager::Instance();
+ auto api = Plugin::API::GetAPI(std::string(apiname));
+ for (auto adapter : api->EnumerateAdapters()) {
+ union {
+ int32_t id[2];
+ int64_t v;
+ } adapterid = { adapter.idLow, adapter.idHigh };
+ if (cm->IsCodecSupportedByAPIAdapter(Codec::HEVC, api->GetType(), adapter))
+ obs_property_list_add_int(p, adapter.Name.c_str(), adapterid.v);
+ }
+}
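fill_device_list above squeezes a two-part adapter id into the single int64 that an OBS_COMBO_FORMAT_INT property list can store. A standalone sketch of that packing (the idLow/idHigh split follows the diff; the union punning is the C-style idiom the plugin uses, though strictly C++ only guarantees reading the union member last written):

```cpp
#include <cstdint>

// Two 32-bit halves of an adapter id sharing storage with one int64,
// as in fill_device_list/properties_modified above.
union AdapterId {
    int32_t id[2]; // id[0] = low half, id[1] = high half
    int64_t v;
};

// Pack the two halves into a single value for the property list.
int64_t pack_adapter_id(int32_t low, int32_t high) {
    AdapterId a;
    a.id[0] = low;
    a.id[1] = high;
    return a.v;
}

// Recover the halves from the stored list value.
void unpack_adapter_id(int64_t v, int32_t& low, int32_t& high) {
    AdapterId a;
    a.v = v;
    low = a.id[0];
    high = a.id[1];
}
```

The packed value is endianness-dependent, but the round trip is stable on any one machine, which is all the property list needs.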
+
+obs_properties_t* Plugin::Interface::H265Interface::get_properties(void* data) {
+ obs_properties* props = obs_properties_create();
+ obs_property_t* p;
+
+ // Static Properties
+ #pragma region Quality Preset
+ p = obs_properties_add_list(props, P_QUALITYPRESET, P_TRANSLATE(P_QUALITYPRESET), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QUALITYPRESET)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_QUALITYPRESET_SPEED), static_cast<int32_t>(QualityPreset::Speed));
+ obs_property_list_add_int(p, P_TRANSLATE(P_QUALITYPRESET_BALANCED), static_cast<int32_t>(QualityPreset::Balanced));
+ obs_property_list_add_int(p, P_TRANSLATE(P_QUALITYPRESET_QUALITY), static_cast<int32_t>(QualityPreset::Quality));
+ #pragma endregion Quality Preset
+
+ #pragma region Profile, Levels
+ p = obs_properties_add_list(props, P_PROFILE, P_TRANSLATE(P_PROFILE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PROFILE)));
+ obs_property_list_add_int(p, "Main", static_cast<int32_t>(Profile::Main));
+
+ p = obs_properties_add_list(props, P_PROFILELEVEL, P_TRANSLATE(P_PROFILELEVEL), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PROFILELEVEL)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_AUTOMATIC), static_cast<int32_t>(ProfileLevel::Automatic));
+ obs_property_list_add_int(p, "1.0", static_cast<int32_t>(ProfileLevel::L10));
+ obs_property_list_add_int(p, "2.0", static_cast<int32_t>(ProfileLevel::L20));
+ obs_property_list_add_int(p, "2.1", static_cast<int32_t>(ProfileLevel::L21));
+ obs_property_list_add_int(p, "3.0", static_cast<int32_t>(ProfileLevel::L30));
+ obs_property_list_add_int(p, "3.1", static_cast<int32_t>(ProfileLevel::L31));
+ obs_property_list_add_int(p, "4.0", static_cast<int32_t>(ProfileLevel::L40));
+ obs_property_list_add_int(p, "4.1", static_cast<int32_t>(ProfileLevel::L41));
+ obs_property_list_add_int(p, "5.0", static_cast<int32_t>(ProfileLevel::L50));
+ obs_property_list_add_int(p, "5.1", static_cast<int32_t>(ProfileLevel::L51));
+ obs_property_list_add_int(p, "5.2", static_cast<int32_t>(ProfileLevel::L52));
+ obs_property_list_add_int(p, "6.0", static_cast<int32_t>(ProfileLevel::L60));
+ obs_property_list_add_int(p, "6.1", static_cast<int32_t>(ProfileLevel::L61));
+ obs_property_list_add_int(p, "6.2", static_cast<int32_t>(ProfileLevel::L62));
+ #pragma endregion Profile, Levels
+
+ #pragma region Tier
+ p = obs_properties_add_list(props, P_TIER, P_TRANSLATE(P_TIER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_TIER)));
+ obs_property_list_add_int(p, "Main", static_cast<int32_t>(H265::Tier::Main));
+ obs_property_list_add_int(p, "High", static_cast<int32_t>(H265::Tier::High));
+ #pragma endregion Tier
+
+ #pragma region Coding Type
+ p = obs_properties_add_list(props, P_CODINGTYPE, P_TRANSLATE(P_CODINGTYPE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_CODINGTYPE)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_AUTOMATIC), static_cast<int32_t>(CodingType::Automatic));
+ obs_property_list_add_int(p, "CABAC", static_cast<int32_t>(CodingType::CABAC));
+ #pragma endregion Coding Type
+
+ #pragma region Maximum Reference Frames
+ p = obs_properties_add_int_slider(props, P_MAXIMUMREFERENCEFRAMES, P_TRANSLATE(P_MAXIMUMREFERENCEFRAMES),
+ 1, 16, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_MAXIMUMREFERENCEFRAMES)));
+ #pragma endregion Maximum Reference Frames
+
+ // Rate Control
+ #pragma region Rate Control Method
+ p = obs_properties_add_list(props, P_RATECONTROLMETHOD, P_TRANSLATE(P_RATECONTROLMETHOD), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_RATECONTROLMETHOD)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_RATECONTROLMETHOD_CQP), static_cast<int32_t>(RateControlMethod::ConstantQP));
+ obs_property_list_add_int(p, P_TRANSLATE(P_RATECONTROLMETHOD_CBR), static_cast<int32_t>(RateControlMethod::ConstantBitrate));
+ obs_property_list_add_int(p, P_TRANSLATE(P_RATECONTROLMETHOD_VBR), static_cast<int32_t>(RateControlMethod::PeakConstrainedVariableBitrate));
+ obs_property_list_add_int(p, P_TRANSLATE(P_RATECONTROLMETHOD_VBRLAT), static_cast<int32_t>(RateControlMethod::LatencyConstrainedVariableBitrate));
+ obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion Rate Control Method
+
+ #pragma region Pre-Pass
+ p = obs_properties_add_list(props, P_PREPASSMODE, P_TRANSLATE(P_PREPASSMODE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PREPASSMODE)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), static_cast<int32_t>(PrePassMode::Disabled));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), static_cast<int32_t>(PrePassMode::Enabled));
+ #pragma endregion Pre-Pass
+
+ #pragma region Parameters
+ /// Bitrate Constraints
+ p = obs_properties_add_int(props, P_BITRATE_TARGET, P_TRANSLATE(P_BITRATE_TARGET), 0, 1, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_BITRATE_TARGET)));
+ p = obs_properties_add_int(props, P_BITRATE_PEAK, P_TRANSLATE(P_BITRATE_PEAK), 0, 1, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_BITRATE_PEAK)));
+
+ /// Method: Constant QP
+ p = obs_properties_add_int_slider(props, P_QP_IFRAME, P_TRANSLATE(P_QP_IFRAME), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_IFRAME)));
+ p = obs_properties_add_int_slider(props, P_QP_PFRAME, P_TRANSLATE(P_QP_PFRAME), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_PFRAME)));
+
+ /// Minimum QP, Maximum QP
+ p = obs_properties_add_int_slider(props, P_QP_IFRAME_MINIMUM, P_TRANSLATE(P_QP_IFRAME_MINIMUM), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_IFRAME_MINIMUM)));
+ p = obs_properties_add_int_slider(props, P_QP_IFRAME_MAXIMUM, P_TRANSLATE(P_QP_IFRAME_MAXIMUM), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_IFRAME_MAXIMUM)));
+ p = obs_properties_add_int_slider(props, P_QP_PFRAME_MINIMUM, P_TRANSLATE(P_QP_PFRAME_MINIMUM), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_PFRAME_MINIMUM)));
+ p = obs_properties_add_int_slider(props, P_QP_PFRAME_MAXIMUM, P_TRANSLATE(P_QP_PFRAME_MAXIMUM), 0, 51, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_QP_PFRAME_MAXIMUM)));
+ #pragma endregion Parameters
+
+ #pragma region Filler Data
+ p = obs_properties_add_list(props, P_FILLERDATA, P_TRANSLATE(P_FILLERDATA), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_FILLERDATA)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion Filler Data
+
+ #pragma region Frame Skipping
+ p = obs_properties_add_list(props, P_FRAMESKIPPING, P_TRANSLATE(P_FRAMESKIPPING), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_FRAMESKIPPING)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ p = obs_properties_add_int(props, P_FRAMESKIPPING_PERIOD, P_TRANSLATE(P_FRAMESKIPPING_PERIOD), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_FRAMESKIPPING_PERIOD)));
+ p = obs_properties_add_list(props, P_FRAMESKIPPING_BEHAVIOUR, P_TRANSLATE(P_FRAMESKIPPING_BEHAVIOUR), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_FRAMESKIPPING_BEHAVIOUR)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_FRAMESKIPPING_SKIPNTH), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_FRAMESKIPPING_KEEPNTH), 1);
+ #pragma endregion Frame Skipping
+
+ #pragma region VBAQ
+ p = obs_properties_add_list(props, P_VBAQ, P_TRANSLATE(P_VBAQ), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBAQ)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion VBAQ
+
+ #pragma region Enforce Hypothetical Reference Decoder Restrictions
+ p = obs_properties_add_list(props, P_ENFORCEHRD, P_TRANSLATE(P_ENFORCEHRD), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_ENFORCEHRD)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion Enforce Hypothetical Reference Decoder Restrictions
+
+ // VBV Buffer
+ #pragma region VBV Buffer Mode
+ p = obs_properties_add_list(props, P_VBVBUFFER, P_TRANSLATE(P_VBVBUFFER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBVBUFFER)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_AUTOMATIC), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_MANUAL), 1);
+ obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion VBV Buffer Mode
+
+ #pragma region VBV Buffer Strictness
+ p = obs_properties_add_float_slider(props, P_VBVBUFFER_STRICTNESS, P_TRANSLATE(P_VBVBUFFER_STRICTNESS), 0.0, 100.0, 0.1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBVBUFFER_STRICTNESS)));
+ #pragma endregion VBV Buffer Strictness
+
+ #pragma region VBV Buffer Size
+ p = obs_properties_add_int_slider(props, P_VBVBUFFER_SIZE, P_TRANSLATE(P_VBVBUFFER_SIZE), 1, 1000000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBVBUFFER_SIZE)));
+ #pragma endregion VBV Buffer Size
+
+ #pragma region VBV Buffer Initial Fullness
+ p = obs_properties_add_float_slider(props, P_VBVBUFFER_INITIALFULLNESS, P_TRANSLATE(P_VBVBUFFER_INITIALFULLNESS), 0.0, 100.0, 100.0 / 64.0);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VBVBUFFER_INITIALFULLNESS)));
+ #pragma endregion VBV Buffer Initial Fullness
+
+ // Picture Control
+ #pragma region Interval and Periods
+ /// Keyframe, IDR
+ p = obs_properties_add_float(props, P_INTERVAL_KEYFRAME, P_TRANSLATE(P_INTERVAL_KEYFRAME), 0, 100, 0.001);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_INTERVAL_KEYFRAME)));
+ p = obs_properties_add_int(props, P_PERIOD_IDR_H265, P_TRANSLATE(P_PERIOD_IDR_H265), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PERIOD_IDR_H265)));
+ /// I-Frame
+ p = obs_properties_add_float(props, P_INTERVAL_IFRAME, P_TRANSLATE(P_INTERVAL_IFRAME), 0, 100, 0.001);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_INTERVAL_IFRAME)));
+ p = obs_properties_add_int(props, P_PERIOD_IFRAME, P_TRANSLATE(P_PERIOD_IFRAME), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PERIOD_IFRAME)));
+ /// P-Frame
+ p = obs_properties_add_float(props, P_INTERVAL_PFRAME, P_TRANSLATE(P_INTERVAL_PFRAME), 0, 100, 0.001);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_INTERVAL_PFRAME)));
+ p = obs_properties_add_int(props, P_PERIOD_PFRAME, P_TRANSLATE(P_PERIOD_PFRAME), 0, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_PERIOD_PFRAME)));
+ #pragma endregion Interval and Periods
+
+ #pragma region GOP Type
+ p = obs_properties_add_list(props, P_GOP_TYPE, P_TRANSLATE(P_GOP_TYPE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_GOP_TYPE)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_GOP_TYPE_FIXED), static_cast<int64_t>(H265::GOPType::Fixed));
+ obs_property_list_add_int(p, P_TRANSLATE(P_GOP_TYPE_VARIABLE), static_cast<int64_t>(H265::GOPType::Variable));
+ obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion GOP Type
+
+ #pragma region GOP Size
+ p = obs_properties_add_int(props, P_GOP_SIZE, P_TRANSLATE(P_GOP_SIZE), 1, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_GOP_SIZE)));
+
+ p = obs_properties_add_int(props, P_GOP_SIZE_MINIMUM, P_TRANSLATE(P_GOP_SIZE_MINIMUM), 1, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_GOP_SIZE_MINIMUM)));
+
+ p = obs_properties_add_int(props, P_GOP_SIZE_MAXIMUM, P_TRANSLATE(P_GOP_SIZE_MAXIMUM), 1, 1000, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_GOP_SIZE_MAXIMUM)));
+ #pragma endregion GOP Size
+
+ /// GOP Alignment?
+
+ #pragma region Deblocking Filter
+ p = obs_properties_add_list(props, P_DEBLOCKINGFILTER, P_TRANSLATE(P_DEBLOCKINGFILTER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_DEBLOCKINGFILTER)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion Deblocking Filter
+
+ #pragma region Motion Estimation
+ p = obs_properties_add_list(props, P_MOTIONESTIMATION, P_TRANSLATE(P_MOTIONESTIMATION), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_MOTIONESTIMATION)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_MOTIONESTIMATION_HALF), 1);
+ obs_property_list_add_int(p, P_TRANSLATE(P_MOTIONESTIMATION_QUARTER), 2);
+ obs_property_list_add_int(p, P_TRANSLATE(P_MOTIONESTIMATION_FULL), 3);
+ #pragma endregion Motion Estimation
+
+ // System
+ #pragma region Video APIs
+ p = obs_properties_add_list(props, P_VIDEO_API, P_TRANSLATE(P_VIDEO_API), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_STRING);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VIDEO_API)));
+ obs_property_set_modified_callback(p, properties_modified);
+ fill_api_list(p);
+ #pragma endregion Video APIs
+
+ #pragma region Video Adapters
+ p = obs_properties_add_list(props, P_VIDEO_ADAPTER, P_TRANSLATE(P_VIDEO_ADAPTER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VIDEO_ADAPTER)));
+ obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion Video Adapters
+
+ #pragma region OpenCL
+ p = obs_properties_add_list(props, P_OPENCL_TRANSFER, P_TRANSLATE(P_OPENCL_TRANSFER), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_OPENCL_TRANSFER)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+
+ p = obs_properties_add_list(props, P_OPENCL_CONVERSION, P_TRANSLATE(P_OPENCL_CONVERSION), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_OPENCL_CONVERSION)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+ #pragma endregion OpenCL
+
+ #pragma region Asynchronous Queue
+ p = obs_properties_add_list(props, P_ASYNCHRONOUSQUEUE, P_TRANSLATE(P_ASYNCHRONOUSQUEUE), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_ASYNCHRONOUSQUEUE)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_DISABLED), 0);
+ obs_property_list_add_int(p, P_TRANSLATE(P_UTIL_SWITCH_ENABLED), 1);
+
+ p = obs_properties_add_int_slider(props, P_ASYNCHRONOUSQUEUE_SIZE, P_TRANSLATE(P_ASYNCHRONOUSQUEUE_SIZE), 1, 32, 1);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_ASYNCHRONOUSQUEUE_SIZE)));
+ #pragma endregion Asynchronous Queue
+
+ #pragma region View Mode
+ p = obs_properties_add_list(props, P_VIEW, P_TRANSLATE(P_VIEW), OBS_COMBO_TYPE_LIST, OBS_COMBO_FORMAT_INT);
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_VIEW)));
+ obs_property_list_add_int(p, P_TRANSLATE(P_VIEW_BASIC), static_cast<int32_t>(ViewMode::Basic));
+ obs_property_list_add_int(p, P_TRANSLATE(P_VIEW_ADVANCED), static_cast<int32_t>(ViewMode::Advanced));
+ obs_property_list_add_int(p, P_TRANSLATE(P_VIEW_EXPERT), static_cast<int32_t>(ViewMode::Expert));
+ obs_property_list_add_int(p, P_TRANSLATE(P_VIEW_MASTER), static_cast<int32_t>(ViewMode::Master));
+ obs_property_set_modified_callback(p, properties_modified);
+ #pragma endregion View Mode
+
+ /// Debug
+ p = obs_properties_add_bool(props, P_DEBUG, P_TRANSLATE(P_DEBUG));
+ obs_property_set_long_description(p, P_TRANSLATE(P_DESC(P_DEBUG)));
+
+ // Disable non-dynamic properties if we have an encoder.
+ obs_properties_set_param(props, data, nullptr);
+
+ return props;
+}
+
+static void obs_data_default_single(obs_properties_t *props, obs_data_t *data, const char* name) {
+ obs_property_t *p = obs_properties_get(props, name);
+ switch (obs_property_get_type(p)) {
+ case OBS_PROPERTY_INVALID:
+ break;
+ case OBS_PROPERTY_BOOL:
+ obs_data_set_bool(data, name, obs_data_get_default_bool(data, name));
+ break;
+ case OBS_PROPERTY_INT:
+ obs_data_set_int(data, name, obs_data_get_default_int(data, name));
+ break;
+ case OBS_PROPERTY_FLOAT:
+ obs_data_set_double(data, name, obs_data_get_default_double(data, name));
+ break;
+ case OBS_PROPERTY_TEXT:
+ case OBS_PROPERTY_PATH:
+ obs_data_set_string(data, name, obs_data_get_default_string(data, name));
+ break;
+ case OBS_PROPERTY_LIST:
+ case OBS_PROPERTY_EDITABLE_LIST:
+ switch (obs_property_list_format(p)) {
+ case OBS_COMBO_FORMAT_INT:
+ obs_data_set_int(data, name, obs_data_get_default_int(data, name));
+ break;
+ case OBS_COMBO_FORMAT_FLOAT:
+ obs_data_set_double(data, name, obs_data_get_default_double(data, name));
+ break;
+ case OBS_COMBO_FORMAT_STRING:
+ obs_data_set_string(data, name, obs_data_get_default_string(data, name));
+ break;
+ }
+ break;
+ case OBS_PROPERTY_COLOR:
+ break;
+ case OBS_PROPERTY_BUTTON:
+ break;
+ case OBS_PROPERTY_FONT:
+ break;
+ case OBS_PROPERTY_FRAME_RATE:
+ break;
+ }
+}
+
+bool Plugin::Interface::H265Interface::properties_modified(obs_properties_t *props, obs_property_t *, obs_data_t *data) {
+ bool result = false;
+ obs_property_t* p;
+
+ #pragma region Video API & Adapter
+ // Video API
+ const char
+ *videoAPI_last = obs_data_get_string(data, ("last" P_VIDEO_API)),
+ *videoAPI_cur = obs_data_get_string(data, P_VIDEO_API);
+ if (strlen(videoAPI_cur) == 0) {
+ p = obs_properties_get(props, P_VIDEO_API);
+ obs_data_set_string(data, P_VIDEO_API, obs_property_list_item_string(p, 0));
+ videoAPI_cur = obs_data_get_string(data, P_VIDEO_API);
+
+ result = true;
+ }
+ /// If a different API was selected, rebuild the device list.
+ if (strcmp(videoAPI_last, videoAPI_cur) != 0) {
+ obs_data_set_string(data, ("last" P_VIDEO_API), videoAPI_cur);
+ fill_device_list(obs_properties_get(props, P_VIDEO_ADAPTER), videoAPI_cur);
+ result = true;
+
+ // Reset Video Adapter to first in list.
+ obs_data_set_int(data, P_VIDEO_ADAPTER,
+ obs_property_list_item_int(obs_properties_get(props, P_VIDEO_ADAPTER), 0));
+ }
+
+ // Video Adapter
+ int64_t
+ videoAdapter_last = obs_data_get_int(data, ("last" P_VIDEO_ADAPTER)),
+ videoAdapter_cur = obs_data_get_int(data, P_VIDEO_ADAPTER);
+ if (videoAdapter_last != videoAdapter_cur) {
+ obs_data_set_int(data, ("last" P_VIDEO_ADAPTER), videoAdapter_cur);
+ result = true;
+
+ auto api = Plugin::API::GetAPI(obs_data_get_string(data, P_VIDEO_API));
+ union {
+ int64_t v;
+ uint32_t id[2];
+ } adapterid = { videoAdapter_cur };
+ auto adapter = api->GetAdapterById(adapterid.id[0], adapterid.id[1]);
+ try {
+ auto enc = EncoderH265(api, adapter);
+
+ #define TEMP_LIMIT_DROPDOWN(func, enm, prop) { \
+ auto tmp_p = obs_properties_get(props, prop); \
+ auto tmp_l = enc.func(); \
+ enm tmp_s = static_cast<enm>(obs_data_get_int(data, obs_property_name(tmp_p))); \
+ for (size_t idx = 0; idx < obs_property_list_item_count(tmp_p); idx++) { \
+ bool enabled = false; \
+ enm tmp_v = static_cast<enm>(obs_property_list_item_int(tmp_p, idx)); \
+ for (auto tmp_k : tmp_l) { \
+ if (tmp_k == tmp_v) { \
+ enabled = true; \
+ break; \
+ } \
+ } \
+ obs_property_list_item_disable(tmp_p, idx, !enabled); \
+ if ((enabled == false) && (tmp_s == tmp_v)) \
+ obs_data_default_single(props, data, obs_property_name(tmp_p)); \
+ } \
+ }
+ #define TEMP_LIMIT_SLIDER(func, prop) { \
+ auto tmp_p = obs_properties_get(props, prop); \
+ auto tmp_l = enc.func(); \
+ obs_property_int_set_limits(tmp_p, (int)tmp_l.first, (int)tmp_l.second, 1); \
+ }
+ #define TEMP_LIMIT_SLIDER_BITRATE(func, prop) { \
+ auto tmp_p = obs_properties_get(props, prop); \
+ auto tmp_l = enc.func(); \
+ obs_property_int_set_limits(tmp_p, (int)tmp_l.first / 1000, (int)tmp_l.second / 1000, 1); \
+ }
+
+ //TEMP_LIMIT_DROPDOWN(CapsUsage, AMD::Usage, P_USAGE);
+ TEMP_LIMIT_DROPDOWN(CapsQualityPreset, AMD::QualityPreset, P_QUALITYPRESET);
+ TEMP_LIMIT_DROPDOWN(CapsProfile, AMD::Profile, P_PROFILE);
+ TEMP_LIMIT_DROPDOWN(CapsProfileLevel, AMD::ProfileLevel, P_PROFILELEVEL);
+ {
+ auto tmp_p = obs_properties_get(props, P_PROFILELEVEL);
+ obs_property_list_item_disable(tmp_p, 0, false);
+ }
+ TEMP_LIMIT_DROPDOWN(CapsTier, AMD::H265::Tier, P_TIER);
+ // Aspect Ratio - No limits, only affects players/transcoders
+ TEMP_LIMIT_DROPDOWN(CapsCodingType, AMD::CodingType, P_CODINGTYPE);
+ TEMP_LIMIT_SLIDER(CapsMaximumReferenceFrames, P_MAXIMUMREFERENCEFRAMES);
+ TEMP_LIMIT_DROPDOWN(CapsRateControlMethod, AMD::RateControlMethod, P_RATECONTROLMETHOD);
+ TEMP_LIMIT_DROPDOWN(CapsPrePassMode, AMD::PrePassMode, P_PREPASSMODE);
+ TEMP_LIMIT_SLIDER_BITRATE(CapsTargetBitrate, P_BITRATE_TARGET);
+ TEMP_LIMIT_SLIDER_BITRATE(CapsPeakBitrate, P_BITRATE_PEAK);
+ TEMP_LIMIT_SLIDER_BITRATE(CapsVBVBufferSize, P_VBVBUFFER_SIZE);
+ } catch (const std::exception& e) {
+ PLOG_ERROR("Exception occurred while updating capabilities: %s",
+ e.what());
+ }
+ }
+ #pragma endregion Video API & Adapter
+
+ #pragma region View Mode
+ ViewMode lastView = static_cast<ViewMode>(obs_data_get_int(data, ("last" P_VIEW))),
+ curView = static_cast<ViewMode>(obs_data_get_int(data, P_VIEW));
+ if (lastView != curView) {
+ obs_data_set_int(data, ("last" P_VIEW), static_cast<int32_t>(curView));
+ result = true;
+ }
+
+ std::vector<std::pair<const char*, ViewMode>> viewstuff = {
+ //std::make_pair(P_PRESET, ViewMode::Basic),
+ // ----------- Static Section
+ //std::make_pair(P_USAGE, ViewMode::Master),
+ std::make_pair(P_QUALITYPRESET, ViewMode::Basic),
+ std::make_pair(P_PROFILE, ViewMode::Advanced),
+ std::make_pair(P_PROFILELEVEL, ViewMode::Advanced),
+ std::make_pair(P_TIER, ViewMode::Advanced),
+ std::make_pair(P_ASPECTRATIO, ViewMode::Master),
+ std::make_pair(P_CODINGTYPE, ViewMode::Expert),
+ std::make_pair(P_MAXIMUMREFERENCEFRAMES, ViewMode::Expert),
+ // ----------- Rate Control Section
+ std::make_pair(P_RATECONTROLMETHOD, ViewMode::Basic),
+ std::make_pair(P_PREPASSMODE, ViewMode::Basic),
+ //std::make_pair(P_BITRATE_TARGET, ViewMode::Basic),
+ //std::make_pair(P_BITRATE_PEAK, ViewMode::Basic),
+ //std::make_pair(P_QP_IFRAME, ViewMode::Basic),
+ //std::make_pair(P_QP_PFRAME, ViewMode::Basic),
+ //std::make_pair(P_QP_BFRAME, ViewMode::Basic),
+ //std::make_pair(P_QP_MINIMUM, ViewMode::Advanced),
+ //std::make_pair(P_QP_MAXIMUM, ViewMode::Advanced),
+ //std::make_pair(P_FILLERDATA, ViewMode::Basic),
+ std::make_pair(P_FRAMESKIPPING, ViewMode::Advanced),
+ std::make_pair(P_FRAMESKIPPING_PERIOD, ViewMode::Master),
+ std::make_pair(P_FRAMESKIPPING_BEHAVIOUR, ViewMode::Master),
+ //std::make_pair(P_VBAQ, ViewMode::Expert),
+ std::make_pair(P_ENFORCEHRD, ViewMode::Expert),
+ // ----------- VBV Buffer
+ std::make_pair(P_VBVBUFFER, ViewMode::Advanced),
+ //std::make_pair(P_VBVBUFFER_STRICTNESS, ViewMode::Advanced),
+ //std::make_pair(P_VBVBUFFER_SIZE, ViewMode::Advanced),
+ std::make_pair(P_VBVBUFFER_INITIALFULLNESS, ViewMode::Expert),
+ // ----------- Picture Control
+ std::make_pair(P_INTERVAL_KEYFRAME, ViewMode::Basic),
+ std::make_pair(P_PERIOD_IDR_H265, ViewMode::Master),
+ std::make_pair(P_INTERVAL_IFRAME, ViewMode::Master),
+ std::make_pair(P_PERIOD_IFRAME, ViewMode::Master),
+ std::make_pair(P_INTERVAL_PFRAME, ViewMode::Master),
+ std::make_pair(P_PERIOD_PFRAME, ViewMode::Master),
+ std::make_pair(P_GOP_TYPE, ViewMode::Expert),
+ //std::make_pair(P_GOP_SIZE, ViewMode::Expert),
+ //std::make_pair(P_GOP_SIZE_MINIMUM, ViewMode::Expert),
+ //std::make_pair(P_GOP_SIZE_MAXIMUM, ViewMode::Expert),
+ std::make_pair(P_DEBLOCKINGFILTER, ViewMode::Expert),
+ std::make_pair(P_MOTIONESTIMATION, ViewMode::Expert),
+ // ----------- Intra-Refresh
+ //std::make_pair("", ViewMode::Master),
+ // ----------- System
+ std::make_pair(P_VIDEO_API, ViewMode::Advanced),
+ std::make_pair(P_VIDEO_ADAPTER, ViewMode::Advanced),
+ std::make_pair(P_OPENCL_TRANSFER, ViewMode::Advanced),
+ std::make_pair(P_OPENCL_CONVERSION, ViewMode::Advanced),
+ std::make_pair(P_ASYNCHRONOUSQUEUE, ViewMode::Expert),
+ std::make_pair(P_ASYNCHRONOUSQUEUE_SIZE, ViewMode::Expert),
+ std::make_pair(P_VIEW, ViewMode::Basic),
+ std::make_pair(P_DEBUG, ViewMode::Basic),
+ };
+ for (std::pair<const char*, ViewMode> kv : viewstuff) {
+ bool vis = curView >= kv.second;
+ obs_property_set_visible(obs_properties_get(props, kv.first), vis);
+ if (!vis)
+ obs_data_default_single(props, data, kv.first);
+ }
+
+ #pragma region Rate Control
+ bool vis_rcm_bitrate_target = false,
+ vis_rcm_bitrate_peak = false,
+ vis_rcm_qp = false,
+ vis_rcm_fillerdata = false;
+
+ RateControlMethod lastRCM = static_cast<RateControlMethod>(obs_data_get_int(data, ("last" P_RATECONTROLMETHOD))),
+ curRCM = static_cast<RateControlMethod>(obs_data_get_int(data, P_RATECONTROLMETHOD));
+ if (lastRCM != curRCM) {
+ obs_data_set_int(data, ("last" P_RATECONTROLMETHOD), static_cast<int32_t>(curRCM));
+ result = true;
+ }
+ switch (curRCM) {
+ case RateControlMethod::ConstantQP:
+ vis_rcm_qp = true;
+ break;
+ case RateControlMethod::ConstantBitrate:
+ vis_rcm_bitrate_target = true;
+ vis_rcm_fillerdata = true;
+ break;
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ vis_rcm_bitrate_target = true;
+ vis_rcm_bitrate_peak = true;
+ break;
+ }
+
+ /// Bitrate
+ obs_property_set_visible(obs_properties_get(props, P_BITRATE_TARGET), vis_rcm_bitrate_target);
+ if (!vis_rcm_bitrate_target)
+ obs_data_default_single(props, data, P_BITRATE_TARGET);
+ obs_property_set_visible(obs_properties_get(props, P_BITRATE_PEAK), vis_rcm_bitrate_peak);
+ if (!vis_rcm_bitrate_peak)
+ obs_data_default_single(props, data, P_BITRATE_PEAK);
+
+ /// QP
+ obs_property_set_visible(obs_properties_get(props, P_QP_IFRAME), vis_rcm_qp);
+ obs_property_set_visible(obs_properties_get(props, P_QP_PFRAME), vis_rcm_qp);
+ if (!vis_rcm_qp) {
+ obs_data_default_single(props, data, P_QP_IFRAME);
+ obs_data_default_single(props, data, P_QP_PFRAME);
+ }
+
+ /// QP Min/Max
+ obs_property_set_visible(obs_properties_get(props, P_QP_IFRAME_MINIMUM), (curView >= ViewMode::Advanced) && !vis_rcm_qp);
+ obs_property_set_visible(obs_properties_get(props, P_QP_IFRAME_MAXIMUM), (curView >= ViewMode::Advanced) && !vis_rcm_qp);
+ obs_property_set_visible(obs_properties_get(props, P_QP_PFRAME_MINIMUM), (curView >= ViewMode::Advanced) && !vis_rcm_qp);
+ obs_property_set_visible(obs_properties_get(props, P_QP_PFRAME_MAXIMUM), (curView >= ViewMode::Advanced) && !vis_rcm_qp);
+ if (!(curView >= ViewMode::Advanced) || vis_rcm_qp) {
+ obs_data_default_single(props, data, P_QP_IFRAME_MINIMUM);
+ obs_data_default_single(props, data, P_QP_IFRAME_MAXIMUM);
+ obs_data_default_single(props, data, P_QP_PFRAME_MINIMUM);
+ obs_data_default_single(props, data, P_QP_PFRAME_MAXIMUM);
+ }
+
+ /// Filler Data (CBR only at the moment)
+ obs_property_set_visible(obs_properties_get(props, P_FILLERDATA), vis_rcm_fillerdata);
+ if (!vis_rcm_fillerdata)
+ obs_data_default_single(props, data, P_FILLERDATA);
+
+ /// VBAQ (Causes issues with Constant QP)
+ obs_property_set_visible(obs_properties_get(props, P_VBAQ), (curView >= ViewMode::Expert) && !vis_rcm_qp);
+ if (!(curView >= ViewMode::Expert) || vis_rcm_qp) {
+ obs_data_default_single(props, data, P_VBAQ);
+ }
+ #pragma endregion Rate Control
+
+ #pragma region VBV Buffer
+ uint32_t vbvBufferMode = static_cast<uint32_t>(obs_data_get_int(data, P_VBVBUFFER));
+ bool vbvBufferVisible = (curView >= ViewMode::Advanced);
+
+ uint32_t lastVBVBufferMode = static_cast<uint32_t>(obs_data_get_int(data, ("last" P_VBVBUFFER)));
+ if (lastVBVBufferMode != vbvBufferMode) {
+ obs_data_set_int(data, ("last" P_VBVBUFFER), vbvBufferMode);
+ result = true;
+ }
+
+ obs_property_set_visible(obs_properties_get(props, P_VBVBUFFER_STRICTNESS), vbvBufferVisible && (vbvBufferMode == 0));
+ obs_property_set_visible(obs_properties_get(props, P_VBVBUFFER_SIZE), vbvBufferVisible && (vbvBufferMode == 1));
+ if (!vbvBufferVisible || vbvBufferMode == 0)
+ obs_data_default_single(props, data, P_VBVBUFFER_SIZE);
+ if (!vbvBufferVisible || vbvBufferMode == 1)
+ obs_data_default_single(props, data, P_VBVBUFFER_STRICTNESS);
+ #pragma endregion VBV Buffer
+
+ #pragma region GOP
+ bool gopvisible = (curView >= ViewMode::Expert);
+ bool goptype_fixed = (static_cast<H265::GOPType>(obs_data_get_int(data, P_GOP_TYPE)) == H265::GOPType::Fixed);
+ obs_property_set_visible(obs_properties_get(props, P_GOP_SIZE), goptype_fixed && gopvisible);
+ obs_property_set_visible(obs_properties_get(props, P_GOP_SIZE_MINIMUM), !goptype_fixed && gopvisible);
+ obs_property_set_visible(obs_properties_get(props, P_GOP_SIZE_MAXIMUM), !goptype_fixed && gopvisible);
+ if (!goptype_fixed) {
+ obs_data_default_single(props, data, P_GOP_SIZE);
+ } else {
+ obs_data_default_single(props, data, P_GOP_SIZE_MINIMUM);
+ obs_data_default_single(props, data, P_GOP_SIZE_MAXIMUM);
+ }
+ #pragma endregion GOP
+ #pragma endregion View Mode
+
+ // Permanently disable static properties while encoding.
+ void* enc = obs_properties_get_param(props);
+ if (enc) {
+ std::vector<const char*> hiddenProperties = {
+ // Static
+ ///P_USAGE,
+ P_QUALITYPRESET,
+ P_PROFILE,
+ P_PROFILELEVEL,
+ P_TIER,
+ P_CODINGTYPE,
+ P_MAXIMUMREFERENCEFRAMES,
+
+ /// Rate Control
+ P_RATECONTROLMETHOD,
+ P_VBVBUFFER,
+ P_VBVBUFFER_STRICTNESS,
+ P_VBVBUFFER_SIZE,
+ P_VBVBUFFER_INITIALFULLNESS,
+ P_PREPASSMODE,
+ P_VBAQ,
+
+ /// Picture Control
+ P_GOP_SIZE,
+ P_GOP_SIZE_MAXIMUM,
+ P_GOP_SIZE_MINIMUM,
+ P_GOP_TYPE,
+ P_INTERVAL_KEYFRAME,
+ P_PERIOD_IDR_H265,
+ P_DEBLOCKINGFILTER,
+ P_MOTIONESTIMATION,
+
+ // System
+ P_VIDEO_API,
+ P_VIDEO_ADAPTER,
+ P_OPENCL_TRANSFER,
+ P_OPENCL_CONVERSION,
+ P_ASYNCHRONOUSQUEUE,
+ P_ASYNCHRONOUSQUEUE_SIZE,
+ P_DEBUG,
+ };
+ for (const char* pr : hiddenProperties) {
+ obs_property_set_enabled(obs_properties_get(props, pr), false);
+ }
+ }
+
+ return true;
+}
+
+void* Plugin::Interface::H265Interface::create(obs_data_t* data, obs_encoder_t* encoder) {
+ try {
+ return new H265Interface(data, encoder);
+ } catch (const std::exception& e) {
+ PLOG_ERROR("%s", e.what());
+ }
+ return nullptr;
+}
+
+Plugin::Interface::H265Interface::H265Interface(obs_data_t* data, obs_encoder_t* encoder) {
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Initializing...");
+
+ m_Encoder = encoder;
+
+ // OBS Settings
+ uint32_t obsWidth = obs_encoder_get_width(encoder);
+ uint32_t obsHeight = obs_encoder_get_height(encoder);
+ video_t *obsVideoInfo = obs_encoder_video(encoder);
+ const struct video_output_info *voi = video_output_get_info(obsVideoInfo);
+ uint32_t obsFPSnum = voi->fps_num;
+ uint32_t obsFPSden = voi->fps_den;
+
+ //////////////////////////////////////////////////////////////////////////
+ /// Initialize Encoder
+ bool debug = obs_data_get_bool(data, P_DEBUG);
+ Plugin::AMD::AMF::Instance()->EnableDebugTrace(debug);
+
+ ColorFormat colorFormat = ColorFormat::NV12;
+ switch (voi->format) {
+ case VIDEO_FORMAT_NV12:
+ colorFormat = ColorFormat::NV12;
+ break;
+ case VIDEO_FORMAT_I420:
+ colorFormat = ColorFormat::I420;
+ break;
+ case VIDEO_FORMAT_YUY2:
+ colorFormat = ColorFormat::YUY2;
+ break;
+ case VIDEO_FORMAT_RGBA:
+ colorFormat = ColorFormat::RGBA;
+ break;
+ case VIDEO_FORMAT_BGRA:
+ colorFormat = ColorFormat::BGRA;
+ break;
+ case VIDEO_FORMAT_Y800:
+ colorFormat = ColorFormat::GRAY;
+ break;
+ }
+ ColorSpace colorSpace = ColorSpace::BT601;
+ switch (voi->colorspace) {
+ case VIDEO_CS_601:
+ colorSpace = ColorSpace::BT601;
+ break;
+ case VIDEO_CS_DEFAULT:
+ case VIDEO_CS_709:
+ colorSpace = ColorSpace::BT709;
+ break;
+ }
+
+ auto api = API::GetAPI(obs_data_get_string(data, P_VIDEO_API));
+ union {
+ int64_t v;
+ uint32_t id[2];
+ } adapterid = { obs_data_get_int(data, P_VIDEO_ADAPTER) };
+ auto adapter = api->GetAdapterById(adapterid.id[0], adapterid.id[1]);
+
+ m_VideoEncoder = std::make_unique<EncoderH265>(api, adapter,
+ !!obs_data_get_int(data, P_OPENCL_TRANSFER), !!obs_data_get_int(data, P_OPENCL_CONVERSION),
+ colorFormat, colorSpace, voi->range == VIDEO_RANGE_FULL,
+ !!obs_data_get_int(data, P_ASYNCHRONOUSQUEUE), (size_t)obs_data_get_int(data, P_ASYNCHRONOUSQUEUE_SIZE));
+
+ /// Static Properties
+ m_VideoEncoder->SetUsage(Usage::Transcoding);
+ m_VideoEncoder->SetQualityPreset(static_cast<QualityPreset>(obs_data_get_int(data, P_QUALITYPRESET)));
+
+ /// Frame
+ m_VideoEncoder->SetResolution(std::make_pair(obsWidth, obsHeight));
+ m_VideoEncoder->SetFrameRate(std::make_pair(obsFPSnum, obsFPSden));
+ ///- Aspect Ratio
+
+ /// Profile & Level
+ m_VideoEncoder->SetProfile(static_cast<Profile>(obs_data_get_int(data, P_PROFILE)));
+ m_VideoEncoder->SetProfileLevel(static_cast<ProfileLevel>(obs_data_get_int(data, P_PROFILELEVEL)));
+ m_VideoEncoder->SetTier(static_cast<H265::Tier>(obs_data_get_int(data, P_TIER)));
+
+ try {
+ m_VideoEncoder->SetCodingType(static_cast<CodingType>(obs_data_get_int(data, P_CODINGTYPE)));
+ } catch (...) {
+ }
+ try {
+ m_VideoEncoder->SetMaximumReferenceFrames(obs_data_get_int(data, P_MAXIMUMREFERENCEFRAMES));
+ } catch (...) {
+ }
+
+ // Rate Control
+ m_VideoEncoder->SetRateControlMethod(static_cast<RateControlMethod>(obs_data_get_int(data, P_RATECONTROLMETHOD)));
+ if (obs_data_get_int(data, P_VBVBUFFER) == 0) {
+ m_VideoEncoder->SetVBVBufferStrictness(obs_data_get_double(data, P_VBVBUFFER_STRICTNESS) / 100.0);
+ } else {
+ m_VideoEncoder->SetVBVBufferSize(static_cast<uint32_t>(obs_data_get_int(data, P_VBVBUFFER_SIZE) * 1000));
+ }
+ m_VideoEncoder->SetVBVBufferInitialFullness(obs_data_get_double(data, P_VBVBUFFER_INITIALFULLNESS) / 100.0);
+ m_VideoEncoder->SetPrePassMode(static_cast<PrePassMode>(obs_data_get_int(data, P_PREPASSMODE)));
+ m_VideoEncoder->SetVarianceBasedAdaptiveQuantizationEnabled((!!obs_data_get_int(data, P_VBAQ)) && (m_VideoEncoder->GetRateControlMethod() != RateControlMethod::ConstantQP));
+
+ // Picture Control
+ uint32_t gopSize = static_cast<uint32_t>(floor(obsFPSnum / (double_t)obsFPSden));
+ H265::GOPType gopType = static_cast<H265::GOPType>(obs_data_get_int(data, P_GOP_TYPE));
+ m_VideoEncoder->SetGOPType(gopType);
+ if (static_cast<ViewMode>(obs_data_get_int(data, P_VIEW)) >= ViewMode::Expert) {
+ switch (gopType) {
+ case H265::GOPType::Fixed:
+ gopSize = (uint32_t)obs_data_get_int(data, P_GOP_SIZE);
+ break;
+ case H265::GOPType::Variable:
+ gopSize = (uint32_t)(obs_data_get_int(data, P_GOP_SIZE_MINIMUM) + obs_data_get_int(data, P_GOP_SIZE_MAXIMUM)) / 2;
+ m_VideoEncoder->SetGOPSizeMin((uint32_t)obs_data_get_int(data, P_GOP_SIZE_MINIMUM));
+ m_VideoEncoder->SetGOPSizeMax((uint32_t)obs_data_get_int(data, P_GOP_SIZE_MAXIMUM));
+ break;
+ }
+ }
+ m_VideoEncoder->SetGOPSize(gopSize);
+ /// Keyframe Interval/Period
+ double_t framerate = (double_t)obsFPSnum / (double_t)obsFPSden;
+ {
+ uint32_t idrperiod = static_cast<uint32_t>(obs_data_get_int(data, P_PERIOD_IDR_H265));
+ if (idrperiod == 0) {
+ double_t keyinterv = obs_data_get_double(data, P_INTERVAL_KEYFRAME);
+ idrperiod = static_cast<uint32_t>(ceil((keyinterv * framerate) / gopSize));
+ }
+ m_VideoEncoder->SetIDRPeriod(idrperiod);
+ }
+ m_VideoEncoder->SetDeblockingFilterEnabled(!!obs_data_get_int(data, P_DEBLOCKINGFILTER));
+ m_VideoEncoder->SetMotionEstimationHalfPixelEnabled(!!(obs_data_get_int(data, P_MOTIONESTIMATION) & 1));
+ m_VideoEncoder->SetMotionEstimationQuarterPixelEnabled(!!(obs_data_get_int(data, P_MOTIONESTIMATION) & 2));
+
+ // OBS - Enforce Streaming Service Restrictions
+ #pragma region OBS - Enforce Streaming Service Restrictions
+ {
+ // Profile
+ const char* p_str = obs_data_get_string(data, "profile");
+ if (strcmp(p_str, "") != 0) {
+ if (strcmp(p_str, "main") == 0) {
+ m_VideoEncoder->SetProfile(Profile::Main);
+ }
+ } else {
+ switch (m_VideoEncoder->GetProfile()) {
+ case Profile::Main:
+ obs_data_set_string(data, "profile", "main");
+ break;
+ }
+ }
+
+ // Preset
+ const char* preset = obs_data_get_string(data, "preset");
+ if (strcmp(preset, "") != 0) {
+ if (strcmp(preset, "speed") == 0) {
+ m_VideoEncoder->SetQualityPreset(QualityPreset::Speed);
+ } else if (strcmp(preset, "balanced") == 0) {
+ m_VideoEncoder->SetQualityPreset(QualityPreset::Balanced);
+ } else if (strcmp(preset, "quality") == 0) {
+ m_VideoEncoder->SetQualityPreset(QualityPreset::Quality);
+ }
+ obs_data_set_int(data, P_QUALITYPRESET, (int32_t)m_VideoEncoder->GetQualityPreset());
+ } else {
+ switch (m_VideoEncoder->GetQualityPreset()) {
+ case QualityPreset::Speed:
+ obs_data_set_string(data, "preset", "speed");
+ break;
+ case QualityPreset::Balanced:
+ obs_data_set_string(data, "preset", "balanced");
+ break;
+ case QualityPreset::Quality:
+ obs_data_set_string(data, "preset", "quality");
+ break;
+ }
+ }
+
+ // Rate Control Method
+ const char* t_str = obs_data_get_string(data, "rate_control");
+ if (strcmp(t_str, "") != 0) {
+ if (strcmp(t_str, "CBR") == 0) {
+ m_VideoEncoder->SetRateControlMethod(RateControlMethod::ConstantBitrate);
+ m_VideoEncoder->SetFillerDataEnabled(true);
+ } else if (strcmp(t_str, "VBR") == 0) {
+ m_VideoEncoder->SetRateControlMethod(RateControlMethod::PeakConstrainedVariableBitrate);
+ } else if (strcmp(t_str, "VBR_LAT") == 0) {
+ m_VideoEncoder->SetRateControlMethod(RateControlMethod::LatencyConstrainedVariableBitrate);
+ } else if (strcmp(t_str, "CQP") == 0) {
+ m_VideoEncoder->SetRateControlMethod(RateControlMethod::ConstantQP);
+ }
+
+ obs_data_set_int(data, P_RATECONTROLMETHOD, (int32_t)m_VideoEncoder->GetRateControlMethod());
+ } else {
+ switch (m_VideoEncoder->GetRateControlMethod()) {
+ case RateControlMethod::ConstantBitrate:
+ obs_data_set_string(data, "rate_control", "CBR");
+ break;
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ obs_data_set_string(data, "rate_control", "VBR");
+ break;
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ obs_data_set_string(data, "rate_control", "VBR_LAT");
+ break;
+ case RateControlMethod::ConstantQP:
+ obs_data_set_string(data, "rate_control", "CQP");
+ break;
+ }
+ }
+
+ // IDR-Period (Keyframes)
+ //uint32_t fpsNum = m_VideoEncoder->GetFrameRate().first;
+ //uint32_t fpsDen = m_VideoEncoder->GetFrameRate().second;
+ //if (obs_data_get_int(data, "keyint_sec") != -1) {
+ // m_VideoEncoder->SetIDRPeriod(static_cast<uint32_t>(obs_data_get_int(data, "keyint_sec") * (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
+
+ // obs_data_set_double(data, P_INTERVAL_KEYFRAME, static_cast<double_t>(obs_data_get_int(data, "keyint_sec")));
+ // obs_data_set_int(data, P_PERIOD_IDR_H264, static_cast<uint32_t>(obs_data_get_int(data, "keyint_sec") * (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
+ //} else {
+ // obs_data_set_int(data, "keyint_sec", static_cast<uint64_t>(m_VideoEncoder->GetIDRPeriod() / (static_cast<double_t>(fpsNum) / static_cast<double_t>(fpsDen))));
+ //}
+ }
+ #pragma endregion OBS - Enforce Streaming Service Restrictions
+
+ // Dynamic Properties (Can be changed during Encoding)
+ this->update(data);
+
+ // Initialize (locks static properties)
+ m_VideoEncoder->Start();
+
+ // Dynamic Properties (Can be changed during Encoding)
+ //this->update(data);
+
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Complete.");
+}
+
+void Plugin::Interface::H265Interface::destroy(void* ptr) {
+ if (ptr)
+ delete static_cast<H265Interface*>(ptr);
+}
+
+Plugin::Interface::H265Interface::~H265Interface() {
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Finalizing...");
+ if (m_VideoEncoder) {
+ m_VideoEncoder->Stop();
+ m_VideoEncoder = nullptr;
+ }
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Complete.");
+}
+
+bool Plugin::Interface::H265Interface::update(void *ptr, obs_data_t *settings) {
+ if (ptr)
+ return static_cast<H265Interface*>(ptr)->update(settings);
+ return false;
+}
+
+bool Plugin::Interface::H265Interface::update(obs_data_t* data) {
+ const video_t *obsVideoInfo = obs_encoder_video(m_Encoder);
+ const struct video_output_info *voi = video_output_get_info(obsVideoInfo);
+ uint32_t obsFPSnum = voi->fps_num;
+ uint32_t obsFPSden = voi->fps_den;
+
+ // Rate Control
+ m_VideoEncoder->SetIFrameQPMinimum(static_cast<uint8_t>(obs_data_get_int(data, P_QP_IFRAME_MINIMUM)));
+ m_VideoEncoder->SetIFrameQPMaximum(static_cast<uint8_t>(obs_data_get_int(data, P_QP_IFRAME_MAXIMUM)));
+ m_VideoEncoder->SetPFrameQPMinimum(static_cast<uint8_t>(obs_data_get_int(data, P_QP_PFRAME_MINIMUM)));
+ m_VideoEncoder->SetPFrameQPMaximum(static_cast<uint8_t>(obs_data_get_int(data, P_QP_PFRAME_MAXIMUM)));
+ switch (m_VideoEncoder->GetRateControlMethod()) {
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ m_VideoEncoder->SetPeakBitrate(static_cast<uint32_t>(obs_data_get_int(data, P_BITRATE_PEAK) * 1000));
+ m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(obs_data_get_int(data, P_BITRATE_TARGET) * 1000));
+ break;
+ case RateControlMethod::ConstantBitrate:
+ m_VideoEncoder->SetPeakBitrate(static_cast<uint32_t>(obs_data_get_int(data, P_BITRATE_TARGET) * 1000));
+ m_VideoEncoder->SetTargetBitrate(static_cast<uint32_t>(obs_data_get_int(data, P_BITRATE_TARGET) * 1000));
+ break;
+ case RateControlMethod::ConstantQP:
+ m_VideoEncoder->SetIFrameQP(static_cast<uint8_t>(obs_data_get_int(data, P_QP_IFRAME)));
+ m_VideoEncoder->SetPFrameQP(static_cast<uint8_t>(obs_data_get_int(data, P_QP_PFRAME)));
+ break;
+ }
+ m_VideoEncoder->SetFrameSkippingEnabled(!!obs_data_get_int(data, P_FRAMESKIPPING));
+ m_VideoEncoder->SetEnforceHRDEnabled(!!obs_data_get_int(data, P_ENFORCEHRD));
+ m_VideoEncoder->SetFillerDataEnabled(!!obs_data_get_int(data, P_FILLERDATA));
+
+ // Picture Control
+ double_t framerate = (double_t)obsFPSnum / (double_t)obsFPSden;
+ /// I/P/Skip Frame Interval/Period
+ {
+ uint32_t period = static_cast<uint32_t>(obs_data_get_double(data, P_INTERVAL_IFRAME) * framerate);
+ period = max(period, static_cast<uint32_t>(obs_data_get_int(data, P_PERIOD_IFRAME)));
+ m_VideoEncoder->SetIFramePeriod(period);
+ }
+ {
+ uint32_t period = static_cast<uint32_t>(obs_data_get_double(data, P_INTERVAL_PFRAME) * framerate);
+ period = max(period, static_cast<uint32_t>(obs_data_get_int(data, P_PERIOD_PFRAME)));
+ m_VideoEncoder->SetPFramePeriod(period);
+ }
+ {
+ uint32_t period = static_cast<uint32_t>(obs_data_get_double(data, P_INTERVAL_BFRAME) * framerate);
+ period = max(period, static_cast<uint32_t>(obs_data_get_int(data, P_PERIOD_BFRAME)));
+ m_VideoEncoder->SetBFramePeriod(period);
+ }
+ {
+ uint32_t period = static_cast<uint32_t>(obs_data_get_int(data, P_FRAMESKIPPING_PERIOD));
+ m_VideoEncoder->SetFrameSkippingPeriod(period);
+ m_VideoEncoder->SetFrameSkippingBehaviour(!!obs_data_get_int(data, P_FRAMESKIPPING_BEHAVIOUR));
+ }
+
+ if (m_VideoEncoder->IsStarted()) {
+ m_VideoEncoder->LogProperties();
+ if (static_cast<ViewMode>(obs_data_get_int(data, P_VIEW)) >= ViewMode::Master)
+ PLOG_ERROR("View Mode 'Master' is active; provide only basic support, since errors are most likely caused by the user's own settings.");
+ }
+
+ return true;
+}
+
+bool Plugin::Interface::H265Interface::encode(void *ptr, struct encoder_frame * frame, struct encoder_packet * packet, bool * received_packet) {
+ if (ptr)
+ return static_cast<H265Interface*>(ptr)->encode(frame, packet, received_packet);
+ return false;
+}
+
+bool Plugin::Interface::H265Interface::encode(struct encoder_frame * frame, struct encoder_packet * packet, bool * received_packet) {
+ if (!frame || !packet || !received_packet)
+ return false;
+
+ try {
+ return m_VideoEncoder->Encode(frame, packet, received_packet);
+ } catch (const std::exception& e) {
+ PLOG_ERROR("Exception during encoding: %s", e.what());
+ } catch (...) {
+ PLOG_ERROR("Unknown exception during encoding.");
+ }
+ return false;
+}
+
+void Plugin::Interface::H265Interface::get_video_info(void *ptr, struct video_scale_info *info) {
+ if (ptr)
+ static_cast<H265Interface*>(ptr)->get_video_info(info);
+}
+
+void Plugin::Interface::H265Interface::get_video_info(struct video_scale_info* info) {
+ m_VideoEncoder->GetVideoInfo(info);
+}
+
+bool Plugin::Interface::H265Interface::get_extra_data(void *ptr, uint8_t** extra_data, size_t* size) {
+ if (ptr)
+ return static_cast<H265Interface*>(ptr)->get_extra_data(extra_data, size);
+ return false;
+}
+
+bool Plugin::Interface::H265Interface::get_extra_data(uint8_t** extra_data, size_t* size) {
+ return m_VideoEncoder->GetExtraData(extra_data, size);
+}
obs-studio-18.0.1.tar.xz/plugins/enc-amf/Source/plugin.cpp -> obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/plugin.cpp
Changed
/*
MIT License
-Copyright (c) 2016 Michael Fabian Dirks
+Copyright (c) 2016-2017
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
*/
#pragma once
-//////////////////////////////////////////////////////////////////////////
-// Includes
-//////////////////////////////////////////////////////////////////////////
-#include <windows.h>
-#include <sstream>
-#include <map>
-
-// Plugin
#include "plugin.h"
#include "api-base.h"
#include "amf.h"
#include "amf-capabilities.h"
+#include <sstream>
+
+#ifdef WITH_AVC
#include "enc-h264.h"
+#endif
+#ifdef WITH_HEVC
+#include "enc-h265.h"
+#endif
+#ifdef _WIN32
+#include <windows.h>
+#endif
+
+using namespace Plugin;
+using namespace Plugin::AMD;
//////////////////////////////////////////////////////////////////////////
// Code
OBS_MODULE_AUTHOR("Michael Fabian Dirks");
OBS_MODULE_USE_DEFAULT_LOCALE("enc-amf", "en-US");
+#ifdef _DEBUG
+#include "components/VideoEncoderVCE.h"
+#include "components/VideoEncoderHEVC.h"
+
+static std::string fastPrintVariant(const char* text, amf::AMFVariantStruct variant) {
+ std::vector<char> buf(1024);
+ switch (variant.type) {
+ case amf::AMF_VARIANT_EMPTY:
+ snprintf(buf.data(), buf.size(), "%s%s", text, "Empty");
+ break;
+ case amf::AMF_VARIANT_BOOL:
+ snprintf(buf.data(), buf.size(), "%s%s", text, variant.boolValue ? "true" : "false");
+ break;
+ case amf::AMF_VARIANT_INT64:
+ snprintf(buf.data(), buf.size(), "%s%lld", text, variant.int64Value);
+ break;
+ case amf::AMF_VARIANT_DOUBLE:
+ snprintf(buf.data(), buf.size(), "%s%f", text, variant.doubleValue);
+ break;
+ case amf::AMF_VARIANT_RECT:
+ snprintf(buf.data(), buf.size(), "%s[%ld,%ld,%ld,%ld]", text,
+ variant.rectValue.top, variant.rectValue.left,
+ variant.rectValue.bottom, variant.rectValue.right);
+ break;
+ case amf::AMF_VARIANT_SIZE:
+ snprintf(buf.data(), buf.size(), "%s%ldx%ld", text,
+ variant.sizeValue.width, variant.sizeValue.height);
+ break;
+ case amf::AMF_VARIANT_POINT:
+ snprintf(buf.data(), buf.size(), "%s[%ld,%ld]", text,
+ variant.pointValue.x, variant.pointValue.y);
+ break;
+ case amf::AMF_VARIANT_RATE:
+ snprintf(buf.data(), buf.size(), "%s%ld/%ld", text,
+ variant.rateValue.num, variant.rateValue.den);
+ break;
+ case amf::AMF_VARIANT_RATIO:
+ snprintf(buf.data(), buf.size(), "%s%ld:%ld", text,
+ variant.ratioValue.num, variant.ratioValue.den);
+ break;
+ case amf::AMF_VARIANT_COLOR:
+ snprintf(buf.data(), buf.size(), "%s(%d,%d,%d,%d)", text,
+ variant.colorValue.r,
+ variant.colorValue.g,
+ variant.colorValue.b,
+ variant.colorValue.a);
+ break;
+ case amf::AMF_VARIANT_STRING:
+ snprintf(buf.data(), buf.size(), "%s'%s'", text,
+ variant.stringValue);
+ break;
+ case amf::AMF_VARIANT_WSTRING:
+ snprintf(buf.data(), buf.size(), "%s'%ls'", text,
+ variant.wstringValue);
+ break;
+ }
+ return std::string(buf.data());
+}
+
+static void printDebugInfo(amf::AMFComponentPtr m_AMFEncoder) {
+ amf::AMFPropertyInfo* pInfo;
+ size_t propCount = m_AMFEncoder->GetPropertyCount();
+ PLOG_INFO("-- Internal AMF Encoder Properties --");
+ for (size_t propIndex = 0; propIndex < propCount; propIndex++) {
+ static const char* typeToString[] = {
+ "Empty",
+ "Boolean",
+ "Int64",
+ "Double",
+ "Rect",
+ "Size",
+ "Point",
+ "Rate",
+ "Ratio",
+ "Color",
+ "String",
+ "WString",
+ "Interface"
+ };
+
+ AMF_RESULT res = m_AMFEncoder->GetPropertyInfo(propIndex, (const amf::AMFPropertyInfo**) &pInfo);
+ if (res != AMF_OK)
+ continue;
+
+ amf::AMFVariantStruct curStruct = amf::AMFVariantStruct();
+ m_AMFEncoder->GetProperty(pInfo->name, &curStruct);
+
+ auto vcur = fastPrintVariant("Current: ", curStruct);
+ auto vdef = fastPrintVariant("Default: ", pInfo->defaultValue);
+ auto vmin = fastPrintVariant("Minimum: ", pInfo->minValue);
+ auto vmax = fastPrintVariant("Maximum: ", pInfo->maxValue);
+ std::stringstream venum;
+ if (pInfo->pEnumDescription) {
+ const amf::AMFEnumDescriptionEntry* pEnumEntry = pInfo->pEnumDescription;
+ while (pEnumEntry->name != nullptr) {
+ QUICK_FORMAT_MESSAGE(tmp, "%ls[%ld]", pEnumEntry->name, pEnumEntry->value);
+ venum << tmp.c_str() << "; ";
+ pEnumEntry++;
+ }
+ }
+
+ PLOG_INFO("%ls(Description: %ls, Type: %s, Index %zu, Content Type: %d, Access: %s%s%s, Values: {%s, %s, %s, %s%s%s})",
+ pInfo->name,
+ pInfo->desc,
+ typeToString[pInfo->type],
+ propIndex,
+ pInfo->contentType,
+ (pInfo->accessType & amf::AMF_PROPERTY_ACCESS_READ) ? "R" : "",
+ (pInfo->accessType & amf::AMF_PROPERTY_ACCESS_WRITE) ? "W" : "",
+ (pInfo->accessType & amf::AMF_PROPERTY_ACCESS_WRITE_RUNTIME) ? "X" : "",
+ vcur.c_str(), vdef.c_str(), vmin.c_str(), vmax.c_str(),
+ (venum.str().length() > 0) ? ", Enum: " : "", venum.str().c_str()
+ );
+ }
+}
+#endif
+
/**
* Required: Called when the module is loaded. Use this function to load all
* the sources/encoders/outputs/services for your module, or anything else that
* false to indicate failure and unload the module
*/
MODULE_EXPORT bool obs_module_load(void) {
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Loading...");
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Loading...");
- // Attempt to load AMF Runtime
+ // AMF
try {
- Plugin::AMD::AMF::GetInstance();
- } catch (std::exception& e) {
- AMF_LOG_ERROR("%s", e.what());
- return true;
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- return true;
+ Plugin::AMD::AMF::Initialize();
+ } catch (const std::exception& e) {
+ PLOG_ERROR("Encountered Exception during AMF initialization: %s", e.what());
+ return false;
} catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
- return true;
+ PLOG_ERROR("Unexpected Exception during AMF initialization.");
+ return false;
}
// Initialize Graphics APIs
+ Plugin::API::InitializeAPIs();
+
+ // AMF Capabilities
try {
- Plugin::API::Base::Initialize();
- } catch (std::exception& e) {
- AMF_LOG_ERROR("%s", e.what());
- return true;
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- return true;
+ Plugin::AMD::CapabilityManager::Initialize();
+ } catch (const std::exception& e) {
+ PLOG_ERROR("Encountered Exception during Capability Manager initialization: %s", e.what());
+ return false;
} catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
- return true;
+ PLOG_ERROR("Unexpected Exception during Capability Manager initialization.");
+ return false;
}
- // Register Encoder
- try {
- Plugin::Interface::H264Interface::encoder_register();
- } catch (std::exception& e) {
- AMF_LOG_ERROR("%s", e.what());
- return true;
- } catch (std::exception* e) {
- AMF_LOG_ERROR("%s", e->what());
- delete e;
- return true;
- } catch (...) {
- AMF_LOG_ERROR("Unknown Exception.");
- return true;
+ // Register Encoders
+ #ifdef WITH_AVC
+ Plugin::Interface::H264Interface::encoder_register();
+ #endif
+ #ifdef WITH_HEVC
+ Plugin::Interface::H265Interface::encoder_register();
+ #endif
+
+ #ifdef _DEBUG
+ {
+ PLOG_INFO("Dumping Parameter Information...");
+ const wchar_t* encoders[] = {
+ AMFVideoEncoderVCE_AVC,
+ AMFVideoEncoder_HEVC
+ };
+ auto m_AMF = AMF::Instance();
+ auto m_AMFFactory = m_AMF->GetFactory();
+ amf::AMFTrace* m_AMFTrace;
+ m_AMFFactory->GetTrace(&m_AMFTrace);
+ amf::AMFDebug* m_AMFDebug;
+ m_AMFFactory->GetDebug(&m_AMFDebug);
+ m_AMFDebug->AssertsEnable(true);
+ m_AMFDebug->EnablePerformanceMonitor(true);
+ m_AMFTrace->EnableWriter(AMF_TRACE_WRITER_FILE, true);
+ m_AMFTrace->EnableWriter(AMF_TRACE_WRITER_DEBUG_OUTPUT, true);
+ m_AMFTrace->SetWriterLevel(AMF_TRACE_WRITER_FILE, 99);
+ m_AMFTrace->SetWriterLevel(AMF_TRACE_WRITER_DEBUG_OUTPUT, 99);
+ m_AMFTrace->SetPath(L"C:\\AMFTrace.log");
+ m_AMFTrace->TraceEnableAsync(true);
+ m_AMFTrace->SetGlobalLevel(99);
+ for (auto enc : encoders) {
+ amf::AMFContextPtr m_AMFContext;
+ if (m_AMFFactory->CreateContext(&m_AMFContext) == AMF_OK) {
+ m_AMFContext->InitDX11(nullptr);
+ amf::AMFComponentPtr m_AMFComponent;
+ if (m_AMFFactory->CreateComponent(m_AMFContext, enc, &m_AMFComponent) == AMF_OK) {
+ PLOG_INFO("-- %ls --", enc);
+ printDebugInfo(m_AMFComponent);
+ m_AMFComponent->Terminate();
+ }
+ m_AMFContext->Terminate();
+ }
+ }
}
+ #endif
- AMF_LOG_DEBUG("<" __FUNCTION_NAME__ "> Complete.");
+ PLOG_DEBUG("<" __FUNCTION_NAME__ "> Loaded.");
return true;
}
/** Optional: Called when the module is unloaded. */
-MODULE_EXPORT void obs_module_unload(void) {}
+MODULE_EXPORT void obs_module_unload(void) {
+ Plugin::AMD::CapabilityManager::Finalize();
+ Plugin::API::FinalizeAPIs();
+ Plugin::AMD::AMF::Finalize();
+}
/** Optional: Returns the full name of the module */
MODULE_EXPORT const char* obs_module_name() {
return "AMD Media Framework Plugin";
}
-// Allow translation strings to reference other translation strings up to a certain depth.
-static std::map<std::string, std::string> translatedMap;
-const char *obs_module_text_multi(const char *key, uint8_t depth) {
- // Check if it already was translated.
- if (!translatedMap.count(std::string(key))) { // If not, translate it now.
- const char* out = obs_module_text(key);
-
- // Allow for nested translations using \@...\@ sequences.
- if (depth > 0) {
- // I'm pretty sure this can be optimized a ton if necessary.
-
- size_t seqStart = 0,
- seqEnd = 0;
- bool haveSequence = false;
-
- std::stringstream fout;
-
- // Walk the given string.
- std::string walkable = std::string(out);
-
- for (size_t pos = 0; pos <= walkable.length(); pos++) {
- std::string walked = walkable.substr(pos, 2);
-
- if (walked == "\\@") { // Sequence Start/End
- if (haveSequence) {
- seqEnd = pos;
-
- std::string sequence = walkable.substr(seqStart, seqEnd - seqStart);
- fout << obs_module_text_multi(sequence.c_str(), depth--);
- } else {
- seqStart = pos + 2;
- }
- haveSequence = !haveSequence;
- pos = pos + 2;
- } else if (!haveSequence) {
- fout << walked.substr(0, 1); // Append the left character.
- }
- }
-
- std::pair<std::string, std::string> kv = std::pair<std::string, std::string>(std::string(key), fout.str());
- translatedMap.insert(kv);
- } else {
- return out;
- }
- }
-
- auto value = translatedMap.find(std::string(key));
- return value->second.c_str();
-}
-
-//////////////////////////////////////////////////////////////////////////
-// Threading Specific
-//////////////////////////////////////////////////////////////////////////
-
-#if (defined _WIN32) || (defined _WIN64) // Windows
-#include <windows.h>
-
-const DWORD MS_VC_EXCEPTION = 0x406D1388;
-
-#pragma pack(push,8)
-typedef struct tagTHREADNAME_INFO {
- DWORD dwType; // Must be 0x1000.
- LPCSTR szName; // Pointer to name (in user addr space).
- DWORD dwThreadID; // Thread ID (-1=caller thread).
- DWORD dwFlags; // Reserved for future use, must be zero.
-} THREADNAME_INFO;
-#pragma pack(pop)
-
-void SetThreadName(uint32_t dwThreadID, const char* threadName) {
-
- // DWORD dwThreadID = ::GetThreadId( static_cast<HANDLE>( t.native_handle() ) );
-
- THREADNAME_INFO info;
- info.dwType = 0x1000;
- info.szName = threadName;
- info.dwThreadID = dwThreadID;
- info.dwFlags = 0;
-
- __try {
- RaiseException(MS_VC_EXCEPTION, 0, sizeof(info) / sizeof(ULONG_PTR), (ULONG_PTR*)&info);
- } __except (EXCEPTION_EXECUTE_HANDLER) {}
-}
-void SetThreadName(const char* threadName) {
- SetThreadName(GetCurrentThreadId(), threadName);
-}
-void SetThreadName(std::thread* thread, const char* threadName) {
- DWORD threadId = ::GetThreadId(static_cast<HANDLE>(thread->native_handle()));
- SetThreadName(threadId, threadName);
-}
-
-#else // Linux, Mac
-#include <sys/prctl.h>
-
-void SetThreadName(std::thread* thread, const char* threadName) {
- auto handle = thread->native_handle();
- pthread_setname_np(handle, threadName);
-}
-void SetThreadName(const char* threadName) {
- prctl(PR_SET_NAME, threadName, 0, 0, 0);
-}
-
-#endif
\ No newline at end of file
obs-studio-18.0.2.tar.xz/plugins/enc-amf/Source/utility.cpp
Added
+/*
+MIT License
+
+Copyright (c) 2016-2017
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+#pragma once
+
+// Plugin
+#include "utility.h"
+#include "amf.h"
+#include "amf-encoder.h"
+#include "components/VideoConverter.h"
+#ifdef WITH_AVC
+#include "amf-encoder-h264.h"
+#include "components/VideoEncoderVCE.h"
+#endif
+#ifdef WITH_HEVC
+#include "amf-encoder-h265.h"
+#include "components/VideoEncoderHEVC.h"
+#endif
+
+#include <map>
+#include <sstream>
+
+using namespace Plugin::AMD;
+
+uint64_t Utility::GetUniqueIdentifier() {
+ static std::mutex __mutex;
+ static uint64_t __curId;
+
+ const std::lock_guard<std::mutex> lock(__mutex);
+ return ++__curId;
+}
+
+static std::map<std::string, std::string> translatedMap;
+const char* Utility::obs_module_text_multi(const char *key, uint8_t depth) {
+ // Check if it already was translated.
+ if (!translatedMap.count(std::string(key))) { // If not, translate it now.
+ const char* out = obs_module_text(key);
+
+ // Allow for nested translations using \@...\@ sequences.
+ if (depth > 0) {
+			// Note: this walk could be optimized if it ever becomes a hot path.
+
+ size_t seqStart = 0,
+ seqEnd = 0;
+ bool haveSequence = false;
+
+ std::stringstream fout;
+
+ // Walk the given string.
+ std::string walkable = std::string(out);
+
+ for (size_t pos = 0; pos <= walkable.length(); pos++) {
+ std::string walked = walkable.substr(pos, 2);
+
+ if (walked == "\\@") { // Sequence Start/End
+ if (haveSequence) {
+ seqEnd = pos;
+
+ std::string sequence = walkable.substr(seqStart, seqEnd - seqStart);
+					fout << obs_module_text_multi(sequence.c_str(), depth - 1);
+ } else {
+ seqStart = pos + 2;
+ }
+ haveSequence = !haveSequence;
+ pos = pos + 1;
+ } else if (!haveSequence) {
+ fout << walked.substr(0, 1); // Append the left character.
+ }
+ }
+
+ std::pair<std::string, std::string> kv = std::pair<std::string, std::string>(std::string(key), fout.str());
+ translatedMap.insert(kv);
+ } else {
+ return out;
+ }
+ }
+
+ auto value = translatedMap.find(std::string(key));
+ return value->second.c_str();
+}
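The `\@...\@` expansion implemented by `obs_module_text_multi` above can be illustrated with a self-contained sketch; here a plain `std::map` stands in for the real `obs_module_text` lookup, and the keys and strings are hypothetical:

```cpp
#include <map>
#include <sstream>
#include <string>

// Hypothetical translation table standing in for obs_module_text().
static std::map<std::string, std::string> g_strings = {
	{"Greeting", "Hello, \\@Name\\@!"},
	{"Name", "World"},
};

std::string text_multi(const std::string &key, int depth) {
	auto it = g_strings.find(key);
	if (it == g_strings.end())
		return key;        // unknown key: return it unchanged
	const std::string &raw = it->second;
	if (depth <= 0)
		return raw;        // depth exhausted: no further expansion

	std::ostringstream out;
	size_t pos = 0;
	while (pos < raw.size()) {
		size_t start = raw.find("\\@", pos);
		if (start == std::string::npos) {
			out << raw.substr(pos);    // no more sequences
			break;
		}
		out << raw.substr(pos, start - pos);
		size_t end = raw.find("\\@", start + 2);
		if (end == std::string::npos) {
			out << raw.substr(start);  // unterminated marker: keep literally
			break;
		}
		// Recurse with a reduced depth so self-referencing keys terminate.
		out << text_multi(raw.substr(start + 2, end - start - 2), depth - 1);
		pos = end + 2;
	}
	return out.str();
}
```

Passing `depth - 1` to the recursive call is what guarantees termination when a key (directly or indirectly) references itself.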
+
+
+// Codec
+const char* Utility::CodecToString(Plugin::AMD::Codec v) {
+ switch (v) {
+ #ifdef WITH_AVC
+ case Codec::AVC:
+ return "H264/AVC";
+ case Codec::SVC:
+ return "H264/SVC";
+ #endif
+ #ifdef WITH_HEVC
+ case Codec::HEVC:
+ return "H265/HEVC";
+ #endif
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+const wchar_t* Utility::CodecToAMF(Plugin::AMD::Codec v) {
+ switch (v) {
+ #ifdef WITH_AVC
+ case Codec::AVC:
+ return AMFVideoEncoderVCE_AVC;
+ case Codec::SVC:
+ return AMFVideoEncoderVCE_SVC;
+ #endif
+ #ifdef WITH_HEVC
+ case Codec::HEVC:
+ return AMFVideoEncoder_HEVC;
+ #endif
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+
+// Color Format
+const char* Utility::ColorFormatToString(Plugin::AMD::ColorFormat v) {
+ switch (v) {
+ case ColorFormat::I420:
+ return "YUV 4:2:0";
+ case ColorFormat::NV12:
+ return "NV12";
+ case ColorFormat::YUY2:
+ return "YUY2";
+ case ColorFormat::BGRA:
+ return "BGRA";
+ case ColorFormat::RGBA:
+ return "RGBA";
+ case ColorFormat::GRAY:
+ return "GRAY";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+amf::AMF_SURFACE_FORMAT Utility::ColorFormatToAMF(Plugin::AMD::ColorFormat v) {
+ switch (v) {
+ case ColorFormat::I420:
+ return amf::AMF_SURFACE_YUV420P;
+ case ColorFormat::NV12:
+ return amf::AMF_SURFACE_NV12;
+ case ColorFormat::YUY2:
+ return amf::AMF_SURFACE_YUY2;
+ case ColorFormat::BGRA:
+ return amf::AMF_SURFACE_BGRA;
+ case ColorFormat::RGBA:
+ return amf::AMF_SURFACE_RGBA;
+ case ColorFormat::GRAY:
+ return amf::AMF_SURFACE_GRAY8;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+
+// Color Space
+const char* Utility::ColorSpaceToString(Plugin::AMD::ColorSpace v) {
+ switch (v) {
+ case ColorSpace::BT601:
+ return "601";
+ case ColorSpace::BT709:
+ return "709";
+ case ColorSpace::BT2020:
+ return "2020";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+AMF_VIDEO_CONVERTER_COLOR_PROFILE_ENUM Utility::ColorSpaceToAMFConverter(Plugin::AMD::ColorSpace v) {
+ switch (v) {
+ case ColorSpace::BT601:
+ return AMF_VIDEO_CONVERTER_COLOR_PROFILE_601;
+ case ColorSpace::BT709:
+ return AMF_VIDEO_CONVERTER_COLOR_PROFILE_709;
+ case ColorSpace::BT2020:
+ return AMF_VIDEO_CONVERTER_COLOR_PROFILE_2020;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+
+// Usage
+const char* Utility::UsageToString(Plugin::AMD::Usage v) {
+ switch (v) {
+ case Usage::Transcoding:
+ return "Transcoding";
+ case Usage::UltraLowLatency:
+ return "Ultra Low Latency";
+ case Usage::LowLatency:
+ return "Low Latency";
+ case Usage::Webcam:
+ return "Webcam";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#ifdef WITH_AVC
+AMF_VIDEO_ENCODER_USAGE_ENUM Utility::UsageToAMFH264(Plugin::AMD::Usage v) {
+ switch (v) {
+ case Usage::Transcoding:
+ return AMF_VIDEO_ENCODER_USAGE_TRANSCONDING;
+ case Usage::UltraLowLatency:
+ return AMF_VIDEO_ENCODER_USAGE_ULTRA_LOW_LATENCY;
+ case Usage::LowLatency:
+ return AMF_VIDEO_ENCODER_USAGE_LOW_LATENCY;
+ case Usage::Webcam:
+ return AMF_VIDEO_ENCODER_USAGE_WEBCAM;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::Usage Utility::UsageFromAMFH264(AMF_VIDEO_ENCODER_USAGE_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_USAGE_TRANSCONDING:
+ return Plugin::AMD::Usage::Transcoding;
+ case AMF_VIDEO_ENCODER_USAGE_ULTRA_LOW_LATENCY:
+ return Plugin::AMD::Usage::UltraLowLatency;
+ case AMF_VIDEO_ENCODER_USAGE_LOW_LATENCY:
+ return Plugin::AMD::Usage::LowLatency;
+ case AMF_VIDEO_ENCODER_USAGE_WEBCAM:
+ return Plugin::AMD::Usage::Webcam;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+#ifdef WITH_HEVC
+AMF_VIDEO_ENCODER_HEVC_USAGE_ENUM Utility::UsageToAMFH265(Plugin::AMD::Usage v) {
+ switch (v) {
+ case Usage::Transcoding:
+ return AMF_VIDEO_ENCODER_HEVC_USAGE_TRANSCONDING;
+ case Usage::UltraLowLatency:
+ return AMF_VIDEO_ENCODER_HEVC_USAGE_ULTRA_LOW_LATENCY;
+ case Usage::LowLatency:
+ return AMF_VIDEO_ENCODER_HEVC_USAGE_LOW_LATENCY;
+ case Usage::Webcam:
+ return AMF_VIDEO_ENCODER_HEVC_USAGE_WEBCAM;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::Usage Utility::UsageFromAMFH265(AMF_VIDEO_ENCODER_HEVC_USAGE_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_HEVC_USAGE_TRANSCONDING:
+ return Usage::Transcoding;
+ case AMF_VIDEO_ENCODER_HEVC_USAGE_ULTRA_LOW_LATENCY:
+ return Usage::UltraLowLatency;
+ case AMF_VIDEO_ENCODER_HEVC_USAGE_LOW_LATENCY:
+ return Usage::LowLatency;
+ case AMF_VIDEO_ENCODER_HEVC_USAGE_WEBCAM:
+ return Usage::Webcam;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+
+// Quality Preset
+const char* Utility::QualityPresetToString(Plugin::AMD::QualityPreset v) {
+ switch (v) {
+ case QualityPreset::Speed:
+ return "Speed";
+ case QualityPreset::Balanced:
+ return "Balanced";
+ case QualityPreset::Quality:
+ return "Quality";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#ifdef WITH_AVC
+AMF_VIDEO_ENCODER_QUALITY_PRESET_ENUM Utility::QualityPresetToAMFH264(Plugin::AMD::QualityPreset v) {
+ switch (v) {
+ case QualityPreset::Speed:
+ return AMF_VIDEO_ENCODER_QUALITY_PRESET_SPEED;
+ case QualityPreset::Balanced:
+ return AMF_VIDEO_ENCODER_QUALITY_PRESET_BALANCED;
+ case QualityPreset::Quality:
+ return AMF_VIDEO_ENCODER_QUALITY_PRESET_QUALITY;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::QualityPreset Utility::QualityPresetFromAMFH264(AMF_VIDEO_ENCODER_QUALITY_PRESET_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_QUALITY_PRESET_SPEED:
+ return QualityPreset::Speed;
+ case AMF_VIDEO_ENCODER_QUALITY_PRESET_BALANCED:
+ return QualityPreset::Balanced;
+ case AMF_VIDEO_ENCODER_QUALITY_PRESET_QUALITY:
+ return QualityPreset::Quality;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+#ifdef WITH_HEVC
+AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_ENUM Utility::QualityPresetToAMFH265(Plugin::AMD::QualityPreset v) {
+ switch (v) {
+ case QualityPreset::Speed:
+ return AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_SPEED;
+ case QualityPreset::Balanced:
+ return AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_BALANCED;
+ case QualityPreset::Quality:
+ return AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_QUALITY;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::QualityPreset Utility::QualityPresetFromAMFH265(AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_SPEED:
+ return QualityPreset::Speed;
+ case AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_BALANCED:
+ return QualityPreset::Balanced;
+ case AMF_VIDEO_ENCODER_HEVC_QUALITY_PRESET_QUALITY:
+ return QualityPreset::Quality;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+
+// Profile
+const char* Utility::ProfileToString(Plugin::AMD::Profile v) {
+ switch (v) {
+ case Profile::ConstrainedBaseline:
+ return "Constrained Baseline";
+ case Profile::Baseline:
+ return "Baseline";
+ case Profile::Main:
+ return "Main";
+ case Profile::ConstrainedHigh:
+ return "Constrained High";
+ case Profile::High:
+ return "High";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#ifdef WITH_AVC
+AMF_VIDEO_ENCODER_PROFILE_ENUM Utility::ProfileToAMFH264(Plugin::AMD::Profile v) {
+ switch (v) {
+ case Profile::ConstrainedBaseline:
+ return (AMF_VIDEO_ENCODER_PROFILE_ENUM)256;
+ case Profile::Baseline:
+ return AMF_VIDEO_ENCODER_PROFILE_BASELINE;
+ case Profile::Main:
+ return AMF_VIDEO_ENCODER_PROFILE_MAIN;
+ case Profile::ConstrainedHigh:
+ return (AMF_VIDEO_ENCODER_PROFILE_ENUM)257;
+ case Profile::High:
+ return AMF_VIDEO_ENCODER_PROFILE_HIGH;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::Profile Utility::ProfileFromAMFH264(AMF_VIDEO_ENCODER_PROFILE_ENUM v) {
+ #pragma warning( disable: 4063 ) // Developer Note: I know better, Compiler.
+ switch (v) {
+ case (AMF_VIDEO_ENCODER_PROFILE_ENUM)256:
+ return Profile::ConstrainedBaseline;
+ case AMF_VIDEO_ENCODER_PROFILE_BASELINE:
+ return Profile::Baseline;
+ case AMF_VIDEO_ENCODER_PROFILE_MAIN:
+ return Profile::Main;
+ case (AMF_VIDEO_ENCODER_PROFILE_ENUM)257:
+ return Profile::ConstrainedHigh;
+ case AMF_VIDEO_ENCODER_PROFILE_HIGH:
+ return Profile::High;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+#ifdef WITH_HEVC
+AMF_VIDEO_ENCODER_HEVC_PROFILE_ENUM Utility::ProfileToAMFH265(Plugin::AMD::Profile v) {
+ switch (v) {
+ case Profile::Main:
+ return AMF_VIDEO_ENCODER_HEVC_PROFILE_MAIN;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::Profile Utility::ProfileFromAMFH265(AMF_VIDEO_ENCODER_HEVC_PROFILE_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_HEVC_PROFILE_MAIN:
+ return Profile::Main;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+
+#ifdef WITH_HEVC
+// Tier
+const char* Utility::TierToString(Plugin::AMD::H265::Tier v) {
+ switch (v) {
+ case H265::Tier::Main:
+ return "Main";
+ case H265::Tier::High:
+ return "High";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+AMF_VIDEO_ENCODER_HEVC_TIER_ENUM Utility::TierToAMFH265(Plugin::AMD::H265::Tier v) {
+ switch (v) {
+ case H265::Tier::Main:
+ return AMF_VIDEO_ENCODER_HEVC_TIER_MAIN;
+ case H265::Tier::High:
+ return AMF_VIDEO_ENCODER_HEVC_TIER_HIGH;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::H265::Tier Utility::TierFromAMFH265(AMF_VIDEO_ENCODER_HEVC_TIER_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_HEVC_TIER_MAIN:
+ return H265::Tier::Main;
+ case AMF_VIDEO_ENCODER_HEVC_TIER_HIGH:
+ return H265::Tier::High;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+
+// Coding Type
+const char* Utility::CodingTypeToString(Plugin::AMD::CodingType v) {
+ switch (v) {
+ case CodingType::Automatic:
+ return "Automatic";
+ case CodingType::CALVC:
+ return "CALVC";
+ case CodingType::CABAC:
+ return "CABAC";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#ifdef WITH_AVC
+AMF_VIDEO_ENCODER_CODING_ENUM Utility::CodingTypeToAMFH264(Plugin::AMD::CodingType v) {
+ switch (v) {
+ case CodingType::Automatic:
+ return AMF_VIDEO_ENCODER_UNDEFINED;
+ case CodingType::CALVC:
+ return AMF_VIDEO_ENCODER_CALV;
+ case CodingType::CABAC:
+ return AMF_VIDEO_ENCODER_CABAC;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::CodingType Utility::CodingTypeFromAMFH264(AMF_VIDEO_ENCODER_CODING_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_UNDEFINED:
+ return CodingType::Automatic;
+ case AMF_VIDEO_ENCODER_CALV:
+ return CodingType::CALVC;
+ case AMF_VIDEO_ENCODER_CABAC:
+ return CodingType::CABAC;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+#ifdef WITH_HEVC
+int64_t Utility::CodingTypeToAMFH265(Plugin::AMD::CodingType v) {
+ switch (v) {
+ case CodingType::Automatic:
+ return 0;
+ case CodingType::CABAC:
+ return 1;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::CodingType Utility::CodingTypeFromAMFH265(int64_t v) {
+ switch (v) {
+ case 0:
+ return CodingType::Automatic;
+ case 1:
+ return CodingType::CABAC;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+
+// Rate Control Method
+const char* Utility::RateControlMethodToString(Plugin::AMD::RateControlMethod v) {
+ switch (v) {
+ case RateControlMethod::ConstantQP:
+ return "Constant Quantization Parameter";
+ case RateControlMethod::ConstantBitrate:
+ return "Constant Bitrate";
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ return "Peak Constrained Variable Bitrate";
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ return "Latency Constrained Variable Bitrate";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#ifdef WITH_AVC
+AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_ENUM Utility::RateControlMethodToAMFH264(Plugin::AMD::RateControlMethod v) {
+ switch (v) {
+ case RateControlMethod::ConstantQP:
+ return AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_CONSTANT_QP;
+ case RateControlMethod::ConstantBitrate:
+ return AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_CBR;
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ return AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_PEAK_CONSTRAINED_VBR;
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ return AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_LATENCY_CONSTRAINED_VBR;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::RateControlMethod Utility::RateControlMethodFromAMFH264(AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_CONSTANT_QP:
+ return RateControlMethod::ConstantQP;
+ case AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_CBR:
+ return RateControlMethod::ConstantBitrate;
+ case AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_PEAK_CONSTRAINED_VBR:
+ return RateControlMethod::PeakConstrainedVariableBitrate;
+ case AMF_VIDEO_ENCODER_RATE_CONTROL_METHOD_LATENCY_CONSTRAINED_VBR:
+ return RateControlMethod::LatencyConstrainedVariableBitrate;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+#ifdef WITH_HEVC
+AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_ENUM Utility::RateControlMethodToAMFH265(Plugin::AMD::RateControlMethod v) {
+ switch (v) {
+ case RateControlMethod::ConstantQP:
+ return AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_CONSTANT_QP;
+ case RateControlMethod::ConstantBitrate:
+ return AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_CBR;
+ case RateControlMethod::PeakConstrainedVariableBitrate:
+ return AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_PEAK_CONSTRAINED_VBR;
+ case RateControlMethod::LatencyConstrainedVariableBitrate:
+ return AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_LATENCY_CONSTRAINED_VBR;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::RateControlMethod Utility::RateControlMethodFromAMFH265(AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_CONSTANT_QP:
+ return RateControlMethod::ConstantQP;
+ case AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_CBR:
+ return RateControlMethod::ConstantBitrate;
+ case AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_PEAK_CONSTRAINED_VBR:
+ return RateControlMethod::PeakConstrainedVariableBitrate;
+ case AMF_VIDEO_ENCODER_HEVC_RATE_CONTROL_METHOD_LATENCY_CONSTRAINED_VBR:
+ return RateControlMethod::LatencyConstrainedVariableBitrate;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+
+// Pre-Pass Method
+const char* Utility::PrePassModeToString(Plugin::AMD::PrePassMode v) {
+ switch (v) {
+ case PrePassMode::Disabled:
+ return "Disabled";
+ case PrePassMode::Enabled:
+ return "Enabled (Full Scale)";
+ case PrePassMode::EnabledAtHalfScale:
+ return "Enabled (Half Scale)";
+ case PrePassMode::EnabledAtQuarterScale:
+ return "Enabled (Quarter Scale)";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#ifdef WITH_AVC
+AMF_VIDEO_ENCODER_PREENCODE_MODE_ENUM Utility::PrePassModeToAMFH264(Plugin::AMD::PrePassMode v) {
+ switch (v) {
+ case PrePassMode::Disabled:
+ return AMF_VIDEO_ENCODER_PREENCODE_DISABLED;
+ case PrePassMode::Enabled:
+ return AMF_VIDEO_ENCODER_PREENCODE_ENABLED;
+ case PrePassMode::EnabledAtHalfScale:
+ return AMF_VIDEO_ENCODER_PREENCODE_ENABLED_DOWNSCALEFACTOR_2;
+ case PrePassMode::EnabledAtQuarterScale:
+ return AMF_VIDEO_ENCODER_PREENCODE_ENABLED_DOWNSCALEFACTOR_4;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::PrePassMode Utility::PrePassModeFromAMFH264(AMF_VIDEO_ENCODER_PREENCODE_MODE_ENUM v) {
+ switch (v) {
+ case AMF_VIDEO_ENCODER_PREENCODE_DISABLED:
+ return PrePassMode::Disabled;
+ case AMF_VIDEO_ENCODER_PREENCODE_ENABLED:
+ return PrePassMode::Enabled;
+ case AMF_VIDEO_ENCODER_PREENCODE_ENABLED_DOWNSCALEFACTOR_2:
+ return PrePassMode::EnabledAtHalfScale;
+ case AMF_VIDEO_ENCODER_PREENCODE_ENABLED_DOWNSCALEFACTOR_4:
+ return PrePassMode::EnabledAtQuarterScale;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+
+// GOP Type
+#ifdef WITH_HEVC
+const char* Utility::GOPTypeToString(Plugin::AMD::H265::GOPType v) {
+ switch (v) {
+ case H265::GOPType::Fixed:
+ return "Fixed";
+ case H265::GOPType::Variable:
+ return "Variable";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+Plugin::AMD::H265::GOPType Utility::GOPTypeFromAMFH265(int64_t v) {
+ switch (v) {
+ case 0:
+ return H265::GOPType::Fixed;
+ case 1:
+ return H265::GOPType::Variable;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+int64_t Utility::GOPTypeToAMFH265(Plugin::AMD::H265::GOPType v) {
+ switch (v) {
+ case H265::GOPType::Fixed:
+ return 0;
+ case H265::GOPType::Variable:
+ return 1;
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+
+// Slicing
+#ifdef WITH_AVC
+const char* Utility::SliceModeToString(Plugin::AMD::H264::SliceMode v) {
+ switch (v) {
+ case H264::SliceMode::Row:
+ return "Row";
+ case H264::SliceMode::Column:
+ return "Column";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+#endif
+const char* Utility::SliceControlModeToString(Plugin::AMD::SliceControlMode v) {
+ switch (v) {
+ case SliceControlMode::Unknown0:
+ return "Unknown 0";
+ case SliceControlMode::Unknown1:
+ return "Unknown 1";
+ case SliceControlMode::Unknown2:
+ return "Unknown 2";
+ case SliceControlMode::Unknown3:
+ return "Unknown 3";
+ }
+ throw std::runtime_error("Invalid Parameter");
+}
+
+#ifdef WITH_AVC
+Plugin::AMD::ProfileLevel Utility::H264ProfileLevel(
+ std::pair<uint32_t, uint32_t> resolution,
+ std::pair<uint32_t, uint32_t> frameRate) {
+ typedef std::pair<uint32_t, uint32_t> levelRestriction;
+ typedef std::pair<ProfileLevel, levelRestriction> level;
+
+ static const level profileLevelLimit[] = { // [Level, [Samples, Samples_Per_Sec]]
+ level(ProfileLevel::L10, levelRestriction(25344, 380160)),
+ level(ProfileLevel::L11, levelRestriction(101376, 768000)),
+ level(ProfileLevel::L12, levelRestriction(101376, 1536000)),
+ level(ProfileLevel::L13, levelRestriction(101376, 3041280)),
+ level(ProfileLevel::L20, levelRestriction(101376, 3041280)),
+ level(ProfileLevel::L21, levelRestriction(202752, 5068800)),
+ level(ProfileLevel::L22, levelRestriction(414720, 5184000)),
+ level(ProfileLevel::L30, levelRestriction(414720, 10368000)),
+ level(ProfileLevel::L31, levelRestriction(921600, 27648000)),
+ level(ProfileLevel::L32, levelRestriction(1310720, 55296000)),
+ //level(H264ProfileLevel::40, levelRestriction(2097152, 62914560)), // Technically identical to 4.1, but backwards compatible.
+ level(ProfileLevel::L41, levelRestriction(2097152, 62914560)),
+ level(ProfileLevel::L42, levelRestriction(2228224, 133693440)),
+ level(ProfileLevel::L50, levelRestriction(5652480, 150994944)),
+ level(ProfileLevel::L51, levelRestriction(9437184, 251658240)),
+ level(ProfileLevel::L52, levelRestriction(9437184, 530841600)),
+ level((ProfileLevel)-1, levelRestriction(0, 0))
+ };
+
+ uint32_t samples = resolution.first * resolution.second;
+ uint32_t samples_sec = (uint32_t)ceil((double_t)samples * ((double_t)frameRate.first / (double_t)frameRate.second));
+
+ level curLevel = profileLevelLimit[0];
+ for (uint32_t index = 0; (int32_t)curLevel.first != -1; index++) {
+ curLevel = profileLevelLimit[index];
+
+ if (samples > curLevel.second.first)
+ continue;
+
+ if (samples_sec > curLevel.second.second)
+ continue;
+
+ return curLevel.first;
+ }
+ return ProfileLevel::L52;
+}
+#endif
+#ifdef WITH_HEVC
+Plugin::AMD::ProfileLevel Utility::H265ProfileLevel(
+ std::pair<uint32_t, uint32_t> resolution,
+ std::pair<uint32_t, uint32_t> frameRate) {
+	typedef std::pair<uint32_t, uint32_t> levelRestriction; // [Samples, Samples_Per_Sec]
+ typedef std::pair<ProfileLevel, levelRestriction> level;
+
+ static const level profileLevelLimit[] = { // [Level, [Samples, Samples_Per_Sec]]
+ level(ProfileLevel::L10, levelRestriction(36864, 552960)),
+ level(ProfileLevel::L20, levelRestriction(122880, 3686400)),
+ level(ProfileLevel::L21, levelRestriction(245760, 7372800)),
+ level(ProfileLevel::L30, levelRestriction(552960, 16588800)),
+ level(ProfileLevel::L31, levelRestriction(983040, 33177600)),
+ level(ProfileLevel::L40, levelRestriction(2228224, 66846720)),
+ level(ProfileLevel::L41, levelRestriction(2228224, 133693440)),
+ level(ProfileLevel::L50, levelRestriction(8912896, 267386880)),
+ level(ProfileLevel::L51, levelRestriction(8912896, 534773760)),
+ level(ProfileLevel::L52, levelRestriction(8912896, 1069547520)),
+ level(ProfileLevel::L60, levelRestriction(35651584, 1069547520)),
+ level(ProfileLevel::L61, levelRestriction(35651584, 2139095040)),
+ level(ProfileLevel::L62, levelRestriction(35651584, 4278190080)),
+ level((ProfileLevel)-1, levelRestriction(0, 0))
+ };
+
+ uint32_t samples = resolution.first * resolution.second;
+ uint32_t samples_sec = (uint32_t)ceil((double_t)samples * ((double_t)frameRate.first / (double_t)frameRate.second));
+
+ level curLevel = profileLevelLimit[0];
+ for (uint32_t index = 0; (int32_t)curLevel.first != -1; index++) {
+ curLevel = profileLevelLimit[index];
+
+ if (samples > curLevel.second.first)
+ continue;
+
+ if (samples_sec > curLevel.second.second)
+ continue;
+
+ return curLevel.first;
+ }
+ return ProfileLevel::L62;
+}
+#endif
+
+//////////////////////////////////////////////////////////////////////////
+// Threading Specific
+//////////////////////////////////////////////////////////////////////////
+
+#if (defined _WIN32) || (defined _WIN64) // Windows
+#include <windows.h>
+
+const DWORD MS_VC_EXCEPTION = 0x406D1388;
+
+#pragma pack(push,8)
+typedef struct tagTHREADNAME_INFO {
+ DWORD dwType; // Must be 0x1000.
+ LPCSTR szName; // Pointer to name (in user addr space).
+ DWORD dwThreadID; // Thread ID (-1=caller thread).
+ DWORD dwFlags; // Reserved for future use, must be zero.
+} THREADNAME_INFO;
+#pragma pack(pop)
+
+void Utility::SetThreadName(uint32_t dwThreadID, const char* threadName) {
+
+ // DWORD dwThreadID = ::GetThreadId( static_cast<HANDLE>( t.native_handle() ) );
+
+ THREADNAME_INFO info;
+ info.dwType = 0x1000;
+ info.szName = threadName;
+ info.dwThreadID = dwThreadID;
+ info.dwFlags = 0;
+
+ __try {
+ RaiseException(MS_VC_EXCEPTION, 0, sizeof(info) / sizeof(ULONG_PTR), (ULONG_PTR*)&info);
+ } __except (EXCEPTION_EXECUTE_HANDLER) {}
+}
+void Utility::SetThreadName(const char* threadName) {
+ Utility::SetThreadName(GetCurrentThreadId(), threadName);
+}
+void Utility::SetThreadName(std::thread* pthread, const char* threadName) {
+ DWORD threadId = ::GetThreadId(static_cast<HANDLE>(pthread->native_handle()));
+ Utility::SetThreadName(threadId, threadName);
+}
+
+#else // Linux, Mac
+#include <sys/prctl.h>
+
+void Utility::SetThreadName(std::thread* pthread, const char* threadName) {
+ auto handle = pthread->native_handle();
+ pthread_setname_np(handle, threadName);
+}
+void Utility::SetThreadName(const char* threadName) {
+ prctl(PR_SET_NAME, threadName, 0, 0, 0);
+}
+
+#endif
\ No newline at end of file
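On the Linux side, the same round trip can be sketched for the calling thread with `prctl` alone (the `PR_GET_NAME` read-back is added here purely for verification; Linux silently truncates names to 15 characters plus the terminator):

```cpp
#include <sys/prctl.h>
#include <string>

// Name the calling thread and read the name back (Linux-specific sketch).
std::string setCurrentThreadName(const char *name) {
	prctl(PR_SET_NAME, name, 0, 0, 0);  // truncated to 15 chars + NUL
	char buf[16] = {0};
	prctl(PR_GET_NAME, buf, 0, 0, 0);   // read back for verification
	return buf;
}
```

The Windows branch has no equivalent read-back: the `0x406D1388` exception is a one-way notification consumed by an attached debugger, which is why it is wrapped in `__try`/`__except` and silently ignored otherwise.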
obs-studio-18.0.1.tar.xz/plugins/image-source/image-source.c -> obs-studio-18.0.2.tar.xz/plugins/image-source/image-source.c
Changed
struct image_source *context = data;
uint64_t frame_time = obs_get_video_frame_time();
+ context->update_time_elapsed += seconds;
+
+ if (context->update_time_elapsed >= 1.0f) {
+ time_t t = get_modified_timestamp(context->file);
+ context->update_time_elapsed = 0.0f;
+
+ if (context->file_timestamp != t) {
+ image_source_load(context);
+ }
+ }
+
if (obs_source_active(context->source)) {
if (!context->active) {
if (context->image.is_animated_gif)
}
context->last_time = frame_time;
-
- context->update_time_elapsed += seconds;
-
- if (context->update_time_elapsed >= 1.0f) {
- time_t t = get_modified_timestamp(context->file);
- context->update_time_elapsed = 0.0f;
-
- if (context->file_timestamp != t) {
- image_source_load(context);
- }
- }
}
obs-studio-18.0.1.tar.xz/plugins/linux-pulseaudio/pulse-input.c -> obs-studio-18.0.2.tar.xz/plugins/linux-pulseaudio/pulse-input.c
Changed
.type = OBS_SOURCE_TYPE_INPUT,
.output_flags = OBS_SOURCE_AUDIO |
OBS_SOURCE_DO_NOT_DUPLICATE |
- OBS_SOURCE_DO_NOT_MONITOR,
+ OBS_SOURCE_DO_NOT_SELF_MONITOR,
.get_name = pulse_output_getname,
.create = pulse_create,
.destroy = pulse_destroy,
obs-studio-18.0.1.tar.xz/plugins/mac-avcapture/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/plugins/mac-avcapture/CMakeLists.txt
Changed
find_library(COREMEDIA CoreMedia)
find_library(COREVIDEO CoreVideo)
find_library(COCOA Cocoa)
+find_library(COREMEDIAIO CoreMediaIO)
include_directories(${AVFOUNDATION}
${COCOA}
${COREFOUNDATION}
${COREMEDIA}
${COREVIDEO}
+ ${COREMEDIAIO}
${COCOA})
set(mac-avcapture_HEADERS
${COREFOUNDATION}
${COREMEDIA}
${COREVIDEO}
+ ${COREMEDIAIO}
${COCOA})
install_obs_plugin_with_data(mac-avcapture data)
obs-studio-18.0.1.tar.xz/plugins/mac-avcapture/av-capture.mm -> obs-studio-18.0.2.tar.xz/plugins/mac-avcapture/av-capture.mm
Changed
#import <CoreFoundation/CoreFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
+#import <CoreMediaIO/CMIOHardware.h>
#include <obs-module.h>
#include <obs.hpp>
CMMediaType mtype = CMFormatDescriptionGetMediaType(
format.formatDescription);
// TODO: support other media types
- if (mtype != kCMMediaType_Video) {
+ if (mtype != kCMMediaType_Video && mtype != kCMMediaType_Muxed) {
AVLOG(LOG_ERROR, "CMMediaType '%s' is unsupported",
AV_FOURCC_STR(mtype));
return false;
AVCaptureSessionPreset640x480,
AVCaptureSessionPreset352x288,
AVCaptureSessionPreset320x240,
- //AVCaptureSessionPresetHigh,
+ AVCaptureSessionPresetHigh,
//AVCaptureSessionPresetMedium,
//AVCaptureSessionPresetLow,
//AVCaptureSessionPresetPhoto,
AVCaptureSessionPreset640x480:@"640x480",
AVCaptureSessionPreset960x540:@"960x540",
AVCaptureSessionPreset1280x720:@"1280x720",
+ AVCaptureSessionPresetHigh:@"High",
};
NSString *name = preset_names[preset];
if (name)
TEXT_DEVICE, OBS_COMBO_TYPE_LIST,
OBS_COMBO_FORMAT_STRING);
obs_property_list_add_string(dev_list, "", "");
+
for (AVCaptureDevice *dev in [AVCaptureDevice
- devicesWithMediaType:AVMediaTypeVideo]) {
- obs_property_list_add_string(dev_list,
- dev.localizedName.UTF8String,
- dev.uniqueID.UTF8String);
+ devices]) {
+ if ([dev hasMediaType: AVMediaTypeVideo] ||
+ [dev hasMediaType: AVMediaTypeMuxed]) {
+ obs_property_list_add_string(dev_list,
+ dev.localizedName.UTF8String,
+ dev.uniqueID.UTF8String);
+ }
}
obs_property_set_modified_callback(dev_list,
bool obs_module_load(void)
{
+#ifdef __MAC_10_10
+ // Enable iOS device to show up as AVCapture devices
+ // From WWDC video 2014 #508 at 5:34
+ // https://developer.apple.com/videos/wwdc/2014/#508
+ CMIOObjectPropertyAddress prop = {
+ kCMIOHardwarePropertyAllowScreenCaptureDevices,
+ kCMIOObjectPropertyScopeGlobal,
+ kCMIOObjectPropertyElementMaster
+ };
+ UInt32 allow = 1;
+ CMIOObjectSetPropertyData(kCMIOObjectSystemObject, &prop, 0, NULL,
+ sizeof(allow), &allow);
+#endif
+
obs_source_info av_capture_info = {
.id = "av_capture_input",
.type = OBS_SOURCE_TYPE_INPUT,
obs-studio-18.0.1.tar.xz/plugins/mac-capture/mac-audio.c -> obs-studio-18.0.2.tar.xz/plugins/mac-capture/mac-audio.c
Changed
.type = OBS_SOURCE_TYPE_INPUT,
.output_flags = OBS_SOURCE_AUDIO |
OBS_SOURCE_DO_NOT_DUPLICATE |
- OBS_SOURCE_DO_NOT_MONITOR,
+ OBS_SOURCE_DO_NOT_SELF_MONITOR,
.get_name = coreaudio_output_getname,
.create = coreaudio_create_output_capture,
.destroy = coreaudio_destroy,
obs-studio-18.0.1.tar.xz/plugins/obs-ffmpeg/CMakeLists.txt -> obs-studio-18.0.2.tar.xz/plugins/obs-ffmpeg/CMakeLists.txt
Changed
${obs-ffmpeg_SOURCES})
target_link_libraries(obs-ffmpeg
libobs
- libff
+ media-playback
${obs-ffmpeg_PLATFORM_DEPS}
${FFMPEG_LIBRARIES})
obs-studio-18.0.1.tar.xz/plugins/obs-ffmpeg/data/locale/en-US.ini -> obs-studio-18.0.2.tar.xz/plugins/obs-ffmpeg/data/locale/en-US.ini
Changed
Looping="Loop"
Input="Input"
InputFormat="Input Format"
-ForceFormat="Force format conversion"
HardwareDecode="Use hardware decoding when available"
ClearOnMediaEnd="Hide source when playback ends"
Advanced="Advanced"
-AudioBufferSize="Audio Buffer Size (frames)"
-VideoBufferSize="Video Buffer Size (frames)"
-FrameDropping="Frame Dropping Level"
-DiscardNone="None"
-DiscardDefault="Default (Invalid Packets)"
-DiscardNonRef="Non-Reference Frames"
-DiscardBiDir="Bi-Directional Frames"
-DiscardNonIntra="Non-Intra Frames"
-DiscardNonKey="Non-Key Frames"
-DiscardAll="All Frames (Careful!)"
RestartWhenActivated="Restart playback when source becomes active"
+CloseFileWhenInactive="Close file when inactive"
+CloseFileWhenInactive.ToolTip="Closes the file when the source is not being displayed on the stream or\nrecording. This allows the file to be changed when the source isn't active,\nbut there may be some startup delay when the source reactivates."
ColorRange="YUV Color Range"
ColorRange.Auto="Auto"
ColorRange.Partial="Partial"
obs-studio-18.0.1.tar.xz/plugins/obs-ffmpeg/obs-ffmpeg-nvenc.c -> obs-studio-18.0.2.tar.xz/plugins/obs-ffmpeg/obs-ffmpeg-nvenc.c
Changed
rc = "CBR";
}
+ /* The "default" preset has been deprecated */
+ if (preset && astrcmpi(preset, "default") == 0)
+ preset = "hq";
+
info.format = voi->format;
info.colorspace = voi->colorspace;
info.range = voi->range;
obs_data_set_default_int(settings, "keyint_sec", 0);
obs_data_set_default_int(settings, "cqp", 23);
obs_data_set_default_string(settings, "rate_control", "CBR");
- obs_data_set_default_string(settings, "preset", "default");
+ obs_data_set_default_string(settings, "preset", "hq");
obs_data_set_default_string(settings, "profile", "main");
obs_data_set_default_string(settings, "level", "auto");
obs_data_set_default_bool(settings, "2pass", true);
#define add_preset(val) \
obs_property_list_add_string(p, obs_module_text("NVENC.Preset." val), \
val)
- add_preset("default");
add_preset("hq");
add_preset("hp");
add_preset("bd");
obs-studio-18.0.1.tar.xz/plugins/obs-ffmpeg/obs-ffmpeg-source.c -> obs-studio-18.0.2.tar.xz/plugins/obs-ffmpeg/obs-ffmpeg-source.c
Changed
#include "obs-ffmpeg-compat.h"
#include "obs-ffmpeg-formats.h"
-#include <libff/ff-demuxer.h>
-
-#include <libswscale/swscale.h>
+#include <media-playback/media.h>
#define FF_LOG(level, format, ...) \
blog(level, "[Media Source]: " format, ##__VA_ARGS__)
static bool video_format(AVCodecContext *codec_context, void *opaque);
struct ffmpeg_source {
- struct ff_demuxer *demuxer;
+ mp_media_t media;
+ bool media_valid;
+ bool destroy_media;
+
struct SwsContext *sws_ctx;
int sws_width;
int sws_height;
enum AVPixelFormat sws_format;
uint8_t *sws_data;
int sws_linesize;
+ enum video_range_type range;
obs_source_t *source;
char *input;
char *input_format;
- enum AVDiscard frame_drop;
- enum video_range_type range;
- int audio_buffer_size;
- int video_buffer_size;
- bool is_advanced;
bool is_looping;
- bool is_forcing_scale;
bool is_hw_decoding;
bool is_clear_on_media_end;
bool restart_on_activate;
+ bool close_when_inactive;
};
-static bool set_obs_frame_colorprops(struct ff_frame *frame,
- struct ffmpeg_source *s, struct obs_source_frame *obs_frame)
-{
- enum AVColorSpace frame_cs = av_frame_get_colorspace(frame->frame);
- enum video_colorspace obs_cs;
-
- switch(frame_cs) {
- case AVCOL_SPC_BT709: obs_cs = VIDEO_CS_709; break;
- case AVCOL_SPC_SMPTE170M:
- case AVCOL_SPC_BT470BG: obs_cs = VIDEO_CS_601; break;
- case AVCOL_SPC_UNSPECIFIED: obs_cs = VIDEO_CS_DEFAULT; break;
- default:
- FF_BLOG(LOG_WARNING, "frame using an unsupported colorspace %d",
- frame_cs);
- obs_cs = VIDEO_CS_DEFAULT;
- }
-
- enum video_range_type range;
- obs_frame->format = ffmpeg_to_obs_video_format(frame->frame->format);
- obs_frame->full_range =
- frame->frame->color_range == AVCOL_RANGE_JPEG;
-
- if (s->range != VIDEO_RANGE_DEFAULT)
- obs_frame->full_range = s->range == VIDEO_RANGE_FULL;
-
- range = obs_frame->full_range ? VIDEO_RANGE_FULL : VIDEO_RANGE_PARTIAL;
-
- if (!video_format_get_parameters(obs_cs,
- range, obs_frame->color_matrix,
- obs_frame->color_range_min,
- obs_frame->color_range_max)) {
- FF_BLOG(LOG_ERROR, "Failed to get video format "
- "parameters for video format %u",
- obs_cs);
- return false;
- }
-
- return true;
-}
-
-bool update_sws_context(struct ffmpeg_source *s, AVFrame *frame)
-{
- if (frame->width != s->sws_width
- || frame->height != s->sws_height
- || frame->format != s->sws_format) {
- if (s->sws_ctx != NULL)
- sws_freeContext(s->sws_ctx);
-
- if (frame->width <= 0 || frame->height <= 0) {
- FF_BLOG(LOG_ERROR, "unable to create a sws "
- "context that has a width(%d) or "
- "height(%d) of zero.", frame->width,
- frame->height);
- goto fail;
- }
-
- s->sws_ctx = sws_getContext(
- frame->width,
- frame->height,
- frame->format,
- frame->width,
- frame->height,
- AV_PIX_FMT_BGRA,
- SWS_BILINEAR,
- NULL, NULL, NULL);
-
- if (s->sws_ctx == NULL) {
- FF_BLOG(LOG_ERROR, "unable to create sws "
- "context with src{w:%d,h:%d,f:%d}->"
- "dst{w:%d,h:%d,f:%d}",
- frame->width, frame->height,
- frame->format, frame->width,
- frame->height, AV_PIX_FMT_BGRA);
- goto fail;
-
- }
-
- if (s->sws_data != NULL)
- bfree(s->sws_data);
- s->sws_data = bzalloc(frame->width * frame->height * 4);
- if (s->sws_data == NULL) {
- FF_BLOG(LOG_ERROR, "unable to allocate sws "
- "pixel data with size %d",
- frame->width * frame->height * 4);
- goto fail;
- }
-
- s->sws_linesize = frame->width * 4;
- s->sws_width = frame->width;
- s->sws_height = frame->height;
- s->sws_format = frame->format;
- }
-
- return true;
-
-fail:
- if (s->sws_ctx != NULL)
- sws_freeContext(s->sws_ctx);
- s->sws_ctx = NULL;
-
- if (s->sws_data)
- bfree(s->sws_data);
- s->sws_data = NULL;
-
- s->sws_linesize = 0;
- s->sws_width = 0;
- s->sws_height = 0;
- s->sws_format = 0;
-
- return false;
-}
-
-static bool video_frame_scale(struct ff_frame *frame,
- struct ffmpeg_source *s, struct obs_source_frame *obs_frame)
-{
- if (!update_sws_context(s, frame->frame))
- return false;
-
- sws_scale(
- s->sws_ctx,
- (uint8_t const *const *)frame->frame->data,
- frame->frame->linesize,
- 0,
- frame->frame->height,
- &s->sws_data,
- &s->sws_linesize
- );
-
- obs_frame->data[0] = s->sws_data;
- obs_frame->linesize[0] = s->sws_linesize;
- obs_frame->format = VIDEO_FORMAT_BGRA;
-
- obs_source_output_video(s->source, obs_frame);
-
- return true;
-}
-
-static bool video_frame_hwaccel(struct ff_frame *frame,
- struct ffmpeg_source *s, struct obs_source_frame *obs_frame)
-{
- // 4th plane is pixelbuf reference for mac
- for (int i = 0; i < 3; i++) {
- obs_frame->data[i] = frame->frame->data[i];
- obs_frame->linesize[i] = frame->frame->linesize[i];
- }
-
- if (!set_obs_frame_colorprops(frame, s, obs_frame))
- return false;
-
- obs_source_output_video(s->source, obs_frame);
- return true;
-}
-
-static bool video_frame_direct(struct ff_frame *frame,
- struct ffmpeg_source *s, struct obs_source_frame *obs_frame)
-{
- int i;
-
- for (i = 0; i < MAX_AV_PLANES; i++) {
- obs_frame->data[i] = frame->frame->data[i];
- obs_frame->linesize[i] = frame->frame->linesize[i];
- }
-
- if (!set_obs_frame_colorprops(frame, s, obs_frame))
- return false;
-
- obs_source_output_video(s->source, obs_frame);
- return true;
-}
-
-static bool video_frame(struct ff_frame *frame, void *opaque)
-{
- struct ffmpeg_source *s = opaque;
- struct obs_source_frame obs_frame = {0};
- uint64_t pts;
-
- // Media ended
- if (frame == NULL) {
- if (s->is_clear_on_media_end)
- obs_source_output_video(s->source, NULL);
- return true;
- }
-
- pts = (uint64_t)(frame->pts * 1000000000.0L);
-
- obs_frame.timestamp = pts;
- obs_frame.width = frame->frame->width;
- obs_frame.height = frame->frame->height;
-
- enum video_format format =
- ffmpeg_to_obs_video_format(frame->frame->format);
-
- if (s->is_forcing_scale || format == VIDEO_FORMAT_NONE)
- return video_frame_scale(frame, s, &obs_frame);
- else if (s->is_hw_decoding)
- return video_frame_hwaccel(frame, s, &obs_frame);
- else
- return video_frame_direct(frame, s, &obs_frame);
-}
-
-static bool audio_frame(struct ff_frame *frame, void *opaque)
-{
- struct ffmpeg_source *s = opaque;
-
- struct obs_source_audio audio_data = {0};
-
- uint64_t pts;
-
- // Media ended
- if (frame == NULL || frame->frame == NULL)
- return true;
-
- pts = (uint64_t)(frame->pts * 1000000000.0L);
-
- int channels = av_frame_get_channels(frame->frame);
-
- for(int i = 0; i < channels; i++)
- audio_data.data[i] = frame->frame->data[i];
-
- audio_data.samples_per_sec = frame->frame->sample_rate;
- audio_data.frames = frame->frame->nb_samples;
- audio_data.timestamp = pts;
- audio_data.format =
- convert_ffmpeg_sample_format(frame->frame->format);
- audio_data.speakers = channels;
-
- obs_source_output_audio(s->source, &audio_data);
-
- return true;
-}
-
static bool is_local_file_modified(obs_properties_t *props,
obs_property_t *prop, obs_data_t *settings)
{
return true;
}
-static bool is_advanced_modified(obs_properties_t *props,
- obs_property_t *prop, obs_data_t *settings)
-{
- UNUSED_PARAMETER(prop);
-
- bool enabled = obs_data_get_bool(settings, "advanced");
- obs_property_t *fscale = obs_properties_get(props, "force_scale");
- obs_property_t *abuf = obs_properties_get(props, "audio_buffer_size");
- obs_property_t *vbuf = obs_properties_get(props, "video_buffer_size");
- obs_property_t *frame_drop = obs_properties_get(props, "frame_drop");
- obs_property_t *color_range = obs_properties_get(props, "color_range");
- obs_property_set_visible(fscale, enabled);
- obs_property_set_visible(abuf, enabled);
- obs_property_set_visible(vbuf, enabled);
- obs_property_set_visible(frame_drop, enabled);
- obs_property_set_visible(color_range, enabled);
-
- return true;
-}
-
static void ffmpeg_source_defaults(obs_data_t *settings)
{
obs_data_set_default_bool(settings, "is_local_file", true);
obs_data_set_default_bool(settings, "looping", false);
obs_data_set_default_bool(settings, "clear_on_media_end", true);
obs_data_set_default_bool(settings, "restart_on_activate", true);
- obs_data_set_default_bool(settings, "force_scale", true);
#if defined(_WIN32)
obs_data_set_default_bool(settings, "hw_decode", true);
#endif
dstr_free(&filter);
dstr_free(&path);
- obs_properties_add_bool(props, "looping", obs_module_text("Looping"));
+ prop = obs_properties_add_bool(props, "looping",
+ obs_module_text("Looping"));
obs_properties_add_bool(props, "restart_on_activate",
obs_module_text("RestartWhenActivated"));
obs_properties_add_bool(props, "clear_on_media_end",
obs_module_text("ClearOnMediaEnd"));
- prop = obs_properties_add_bool(props, "advanced",
- obs_module_text("Advanced"));
-
- obs_property_set_modified_callback(prop, is_advanced_modified);
-
- obs_properties_add_bool(props, "force_scale",
- obs_module_text("ForceFormat"));
-
- prop = obs_properties_add_int(props, "audio_buffer_size",
- obs_module_text("AudioBufferSize"), 1, 9999, 1);
-
- obs_property_set_visible(prop, false);
+ prop = obs_properties_add_bool(props, "close_when_inactive",
+ obs_module_text("CloseFileWhenInactive"));
- prop = obs_properties_add_int(props, "video_buffer_size",
- obs_module_text("VideoBufferSize"), 1, 9999, 1);
-
- obs_property_set_visible(prop, false);
-
- prop = obs_properties_add_list(props, "frame_drop",
- obs_module_text("FrameDropping"), OBS_COMBO_TYPE_LIST,
- OBS_COMBO_FORMAT_INT);
-
- obs_property_list_add_int(prop, obs_module_text("DiscardNone"),
- AVDISCARD_NONE);
- obs_property_list_add_int(prop, obs_module_text("DiscardDefault"),
- AVDISCARD_DEFAULT);
- obs_property_list_add_int(prop, obs_module_text("DiscardNonRef"),
- AVDISCARD_NONREF);
- obs_property_list_add_int(prop, obs_module_text("DiscardBiDir"),
- AVDISCARD_BIDIR);
-#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(55, 67, 100)
- obs_property_list_add_int(prop, obs_module_text("DiscardNonIntra"),
- AVDISCARD_NONINTRA);
-#endif
- obs_property_list_add_int(prop, obs_module_text("DiscardNonKey"),
- AVDISCARD_NONKEY);
- obs_property_list_add_int(prop, obs_module_text("DiscardAll"),
- AVDISCARD_ALL);
-
- obs_property_set_visible(prop, false);
+ obs_property_set_long_description(prop,
+ obs_module_text("CloseFileWhenInactive.ToolTip"));
prop = obs_properties_add_list(props, "color_range",
obs_module_text("ColorRange"), OBS_COMBO_TYPE_LIST,
obs_property_list_add_int(prop, obs_module_text("ColorRange.Full"),
VIDEO_RANGE_FULL);
- obs_property_set_visible(prop, false);
-
return props;
}
-static const char *frame_drop_to_str(enum AVDiscard discard)
-{
-#define DISCARD_CASE(x) case AVDISCARD_ ## x: return "AVDISCARD_" #x
- switch (discard)
- {
- DISCARD_CASE(NONE);
- DISCARD_CASE(DEFAULT);
- DISCARD_CASE(NONREF);
- DISCARD_CASE(BIDIR);
-#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(55, 67, 100)
- DISCARD_CASE(NONINTRA);
-#endif
- DISCARD_CASE(NONKEY);
- DISCARD_CASE(ALL);
- default: return "(Unknown)";
- };
-#undef DISCARD_CASE
-}
-
static void dump_source_info(struct ffmpeg_source *s, const char *input,
- const char *input_format, bool is_advanced)
+ const char *input_format)
{
FF_BLOG(LOG_INFO,
"settings:\n"
"\tinput: %s\n"
"\tinput_format: %s\n"
"\tis_looping: %s\n"
- "\tis_forcing_scale: %s\n"
"\tis_hw_decoding: %s\n"
"\tis_clear_on_media_end: %s\n"
- "\trestart_on_activate: %s",
+ "\trestart_on_activate: %s\n"
+ "\tclose_when_inactive: %s",
input ? input : "(null)",
input_format ? input_format : "(null)",
s->is_looping ? "yes" : "no",
- s->is_forcing_scale ? "yes" : "no",
s->is_hw_decoding ? "yes" : "no",
s->is_clear_on_media_end ? "yes" : "no",
- s->restart_on_activate ? "yes" : "no");
+ s->restart_on_activate ? "yes" : "no",
+ s->close_when_inactive ? "yes" : "no");
+}
- if (!is_advanced)
- return;
+static void get_frame(void *opaque, struct obs_source_frame *f)
+{
+ struct ffmpeg_source *s = opaque;
+ obs_source_output_video(s->source, f);
+}
- FF_BLOG(LOG_INFO,
- "advanced settings:\n"
- "\taudio_buffer_size: %d\n"
- "\tvideo_buffer_size: %d\n"
- "\tframe_drop: %s",
- s->audio_buffer_size,
- s->video_buffer_size,
- frame_drop_to_str(s->frame_drop));
+static void preload_frame(void *opaque, struct obs_source_frame *f)
+{
+ struct ffmpeg_source *s = opaque;
+ obs_source_preload_video(s->source, f);
}
-static void ffmpeg_source_start(struct ffmpeg_source *s)
+static void get_audio(void *opaque, struct obs_source_audio *a)
{
- if (s->demuxer != NULL)
- ff_demuxer_free(s->demuxer);
-
- s->demuxer = ff_demuxer_init();
- s->demuxer->options.is_hw_decoding = s->is_hw_decoding;
- s->demuxer->options.is_looping = s->is_looping;
-
- ff_demuxer_set_callbacks(&s->demuxer->video_callbacks,
- video_frame, NULL,
- NULL, NULL, NULL, s);
-
- ff_demuxer_set_callbacks(&s->demuxer->audio_callbacks,
- audio_frame, NULL,
- NULL, NULL, NULL, s);
-
- if (s->is_advanced) {
- s->demuxer->options.audio_frame_queue_size =
- s->audio_buffer_size;
- s->demuxer->options.video_frame_queue_size =
- s->video_buffer_size;
- s->demuxer->options.frame_drop = s->frame_drop;
+ struct ffmpeg_source *s = opaque;
+ obs_source_output_audio(s->source, a);
+}
+
+static void media_stopped(void *opaque)
+{
+ struct ffmpeg_source *s = opaque;
+ if (s->is_clear_on_media_end) {
+ obs_source_output_video(s->source, NULL);
+ if (s->close_when_inactive)
+ s->destroy_media = true;
}
+}
- ff_demuxer_open(s->demuxer, s->input, s->input_format);
+static void ffmpeg_source_open(struct ffmpeg_source *s)
+{
+ if (s->input && *s->input)
+ s->media_valid = mp_media_init(&s->media,
+ s->input, s->input_format,
+ s, get_frame, get_audio, media_stopped,
+ preload_frame, s->is_hw_decoding, s->range);
+}
+
+static void ffmpeg_source_tick(void *data, float seconds)
+{
+ struct ffmpeg_source *s = data;
+ if (s->destroy_media) {
+ if (s->media_valid) {
+ mp_media_free(&s->media);
+ s->media_valid = false;
+ }
+ s->destroy_media = false;
+ }
+}
+
+static void ffmpeg_source_start(struct ffmpeg_source *s)
+{
+ if (!s->media_valid)
+ ffmpeg_source_open(s);
+
+ if (s->media_valid) {
+ mp_media_play(&s->media, s->is_looping);
+ obs_source_show_preloaded_video(s->source);
+ }
}
static void ffmpeg_source_update(void *data, obs_data_t *settings)
struct ffmpeg_source *s = data;
bool is_local_file = obs_data_get_bool(settings, "is_local_file");
- bool is_advanced = obs_data_get_bool(settings, "advanced");
char *input;
char *input_format;
input = (char *)obs_data_get_string(settings, "local_file");
input_format = NULL;
s->is_looping = obs_data_get_bool(settings, "looping");
+
+ obs_source_set_flags(s->source, OBS_SOURCE_FLAG_UNBUFFERED);
} else {
input = (char *)obs_data_get_string(settings, "input");
input_format = (char *)obs_data_get_string(settings,
"input_format");
s->is_looping = false;
+
+ obs_source_set_flags(s->source, 0);
}
s->input = input ? bstrdup(input) : NULL;
s->input_format = input_format ? bstrdup(input_format) : NULL;
- s->is_advanced = is_advanced;
s->is_hw_decoding = obs_data_get_bool(settings, "hw_decode");
s->is_clear_on_media_end = obs_data_get_bool(settings,
"clear_on_media_end");
s->restart_on_activate = obs_data_get_bool(settings,
"restart_on_activate");
- s->is_forcing_scale = true;
- s->range = VIDEO_RANGE_DEFAULT;
-
- if (is_advanced) {
- s->audio_buffer_size = (int)obs_data_get_int(settings,
- "audio_buffer_size");
- s->video_buffer_size = (int)obs_data_get_int(settings,
- "video_buffer_size");
- s->frame_drop = (enum AVDiscard)obs_data_get_int(settings,
- "frame_drop");
- s->is_forcing_scale = obs_data_get_bool(settings,
- "force_scale");
- s->range = (enum video_range_type)obs_data_get_int(settings,
- "color_range");
-
- if (s->audio_buffer_size < 1) {
- s->audio_buffer_size = 1;
- FF_BLOG(LOG_WARNING, "invalid audio_buffer_size %d",
- s->audio_buffer_size);
- }
- if (s->video_buffer_size < 1) {
- s->video_buffer_size = 1;
- FF_BLOG(LOG_WARNING, "invalid audio_buffer_size %d",
- s->audio_buffer_size);
- }
-
- if (s->frame_drop < AVDISCARD_NONE ||
- s->frame_drop > AVDISCARD_ALL) {
- s->frame_drop = AVDISCARD_DEFAULT;
- FF_BLOG(LOG_WARNING, "invalid frame_drop %d",
- s->frame_drop);
- }
+ s->close_when_inactive = obs_data_get_bool(settings,
+ "close_when_inactive");
+ s->range = (enum video_range_type)obs_data_get_int(settings,
+ "color_range");
+
+ if (s->media_valid) {
+ mp_media_free(&s->media);
+ s->media_valid = false;
}
- dump_source_info(s, input, input_format, is_advanced);
- if (!s->restart_on_activate || obs_source_active(s->source))
+ bool active = obs_source_active(s->source);
+ if (!s->close_when_inactive || active)
+ ffmpeg_source_open(s);
+
+ dump_source_info(s, input, input_format);
+ if (!s->restart_on_activate || active)
ffmpeg_source_start(s);
}
{
struct ffmpeg_source *s = data;
- if (s->demuxer)
- ff_demuxer_free(s->demuxer);
+ if (s->media_valid)
+ mp_media_free(&s->media);
if (s->sws_ctx != NULL)
sws_freeContext(s->sws_ctx);
struct ffmpeg_source *s = data;
if (s->restart_on_activate) {
- if (s->demuxer != NULL) {
- ff_demuxer_free(s->demuxer);
- s->demuxer = NULL;
+ if (s->media_valid) {
+ mp_media_stop(&s->media);
if (s->is_clear_on_media_end)
obs_source_output_video(s->source, NULL);
.get_properties = ffmpeg_source_getproperties,
.activate = ffmpeg_source_activate,
.deactivate = ffmpeg_source_deactivate,
+ .video_tick = ffmpeg_source_tick,
.update = ffmpeg_source_update
};
obs-studio-18.0.1.tar.xz/plugins/obs-outputs/flv-mux.c -> obs-studio-18.0.2.tar.xz/plugins/obs-outputs/flv-mux.c
Changed
s_wb24(s, get_ms_time(packet, offset));
s_write(s, packet->data, packet->size);
- /* write tag size (starting byte doesnt count) */
+ /* write tag size (starting byte doesn't count) */
s_wb32(s, (uint32_t)serializer_get_pos(s) + 4 - 1);
}
s_w8(s, is_header ? 0 : 1);
s_write(s, packet->data, packet->size);
- /* write tag size (starting byte doesnt count) */
+ /* write tag size (starting byte doesn't count) */
s_wb32(s, (uint32_t)serializer_get_pos(s) + 4 - 1);
}
obs-studio-18.0.1.tar.xz/plugins/obs-outputs/rtmp-stream.c -> obs-studio-18.0.2.tar.xz/plugins/obs-outputs/rtmp-stream.c
Changed
info("Connecting to RTMP URL %s...", stream->path.array);
- memset(&stream->rtmp.Link, 0, sizeof(stream->rtmp.Link));
+ RTMP_Init(&stream->rtmp);
if (!RTMP_SetupURL(&stream->rtmp, stream->path.array))
return OBS_OUTPUT_BAD_PATH;
obs-studio-18.0.1.tar.xz/plugins/obs-qsv11/QSV_Encoder.cpp -> obs-studio-18.0.2.tar.xz/plugins/obs-qsv11/QSV_Encoder.cpp
Changed
return QSV_CPU_PLATFORM_HSW;
}
- //assume newer revisions are at least as capable as haswell
+ //assume newer revisions are at least as capable as Haswell
return QSV_CPU_PLATFORM_INTEL;
}
obs-studio-18.0.1.tar.xz/plugins/obs-qsv11/QSV_Encoder_Internal.cpp -> obs-studio-18.0.2.tar.xz/plugins/obs-qsv11/QSV_Encoder_Internal.cpp
Changed
}
for (;;) {
- // Encode a frame asychronously (returns immediately)
+ // Encode a frame asynchronously (returns immediately)
sts = m_pmfxENC->EncodeFrameAsync(NULL, pSurface,
&m_pTaskPool[nTaskIdx].mfxBS,
&m_pTaskPool[nTaskIdx].syncp);
obs-studio-18.0.1.tar.xz/plugins/obs-qsv11/common_directx11.cpp -> obs-studio-18.0.2.tar.xz/plugins/obs-qsv11/common_directx11.cpp
Changed
} else {
pSurface->GetDesc(&desc);
- // copy data only in case of user wants o read from stored surface
+ // copy data only in case of user wants to read from stored surface
if (memId->rw & WILL_READ)
g_pD3D11Ctx->CopySubresourceRegion(pStage, 0, 0, 0, 0, pSurface, 0, NULL);
obs-studio-18.0.1.tar.xz/plugins/obs-qsv11/common_directx9.h -> obs-studio-18.0.2.tar.xz/plugins/obs-qsv11/common_directx9.h
Changed
class IGFXS3DControl;
/** Direct3D 9 device implementation.
-@note Can be initilized for only 1 or two 2 views. Handle to
+@note Can be initialized for only 1 or two 2 views. Handle to
MFX_HANDLE_GFXS3DCONTROL must be set prior if initializing for 2 views.
@note Device always set D3DPRESENT_PARAMETERS::Windowed to TRUE.
obs-studio-18.0.1.tar.xz/plugins/obs-qsv11/common_utils.h -> obs-studio-18.0.2.tar.xz/plugins/obs-qsv11/common_utils.h
Changed
void PrintErrString(int err,const char* filestr,int line);
// LoadRawFrame: Reads raw frame from YUV file (YV12) into NV12 surface
-// - YV12 is a more common format for for YUV files than NV12 (therefore the conversion during read and write)
+// - YV12 is a more common format for YUV files than NV12 (therefore the conversion during read and write)
// - For the simulation case (fSource = NULL), the surface is filled with default image data
// LoadRawRGBFrame: Reads raw RGB32 frames from file into RGB32 surface
// - For the simulation case (fSource = NULL), the surface is filled with default image data
obs-studio-18.0.1.tar.xz/plugins/obs-qsv11/device_directx9.cpp -> obs-studio-18.0.2.tar.xz/plugins/obs-qsv11/device_directx9.cpp
Changed
#if defined(WIN32) || defined(WIN64)
-//prefast singnature used in combaseapi.h
+//prefast signature used in combaseapi.h
#ifndef _PREFAST_
#pragma warning(disable:4068)
#endif
obs-studio-18.0.1.tar.xz/plugins/obs-qsv11/device_directx9.h -> obs-studio-18.0.2.tar.xz/plugins/obs-qsv11/device_directx9.h
Changed
}; //mfxHandleType
/** Direct3D 9 device implementation.
-@note Can be initilized for only 1 or two 2 views. Handle to
+@note Can be initialized for only 1 or two 2 views. Handle to
MFX_HANDLE_GFXS3DCONTROL must be set prior if initializing for 2 views.
@note Device always set D3DPRESENT_PARAMETERS::Windowed to TRUE.
obs-studio-18.0.1.tar.xz/plugins/obs-qsv11/obs-qsv11.c -> obs-studio-18.0.2.tar.xz/plugins/obs-qsv11/obs-qsv11.c
Changed
//int iType = iFrame ? 0 : (bFrame ? 1 : (pFrame ? 2 : -1));
//int64_t interval = obsqsv->params.nbFrames + 1;
- // In case MSDK does't support automatic DecodeTimeStamp, do manual
+ // In case MSDK doesn't support automatic DecodeTimeStamp, do manual
// calculation
if (g_pts2dtsShift >= 0)
{
obs-studio-18.0.1.tar.xz/plugins/rtmp-services/data/package.json -> obs-studio-18.0.2.tar.xz/plugins/rtmp-services/data/package.json
Changed
{
"url": "https://obsproject.com/obs2_update/rtmp-services",
- "version": 52,
+ "version": 56,
"files": [
{
"name": "services.json",
- "version": 52
+ "version": 56
}
]
}
obs-studio-18.0.1.tar.xz/plugins/rtmp-services/data/services.json -> obs-studio-18.0.2.tar.xz/plugins/rtmp-services/data/services.json
Changed
],
"recommended": {
"keyint": 2,
- "max video bitrate": 3500,
+ "max video bitrate": 6000,
"max audio bitrate": 160,
"x264opts": "scenecut=0"
}
"url": "rtmp://singapore.restream.io/live"
},
{
+ "name": "Asia (Seoul, South Korea)",
+ "url": "rtmp://seoul.restream.io/live"
+ },
+ {
"name": "Australia (Sydney)",
"url": "rtmp://au.restream.io/live"
}
"recommended": {
"keyint": 2,
"profile": "main",
- "max video bitrate": 2000,
- "max audio bitrate": 160
+ "max video bitrate": 3500,
+ "max audio bitrate": 128
}
},
{
"url": "rtmp://plive.pandora.tv:80/mediaHub"
}
]
+ },
+ {
+ "name": "LiveStream",
+ "servers": [
+ {
+ "name": "Primary",
+ "url": "rtmp://rtmpin.livestreamingest.com/rtmpin"
+ }
+ ]
}
]
}
obs-studio-18.0.1.tar.xz/plugins/win-capture/data/locale/en-US.ini -> obs-studio-18.0.2.tar.xz/plugins/win-capture/data/locale/en-US.ini
Changed
WindowCapture="Window Capture"
WindowCapture.Window="Window"
WindowCapture.Priority="Window Match Priority"
-WindowCapture.Priority.Title="Window Title"
-WindowCapture.Priority.Class="Window Class"
-WindowCapture.Priority.Exe="Executable Name"
+WindowCapture.Priority.Title="Window title must match"
+WindowCapture.Priority.Class="Match title, otherwise find window of same type"
+WindowCapture.Priority.Exe="Match title, otherwise find window of same executable"
CaptureCursor="Capture Cursor"
Compatibility="Multi-adapter Compatibility"
AllowTransparency="Allow Transparency"
obs-studio-18.0.1.tar.xz/plugins/win-capture/game-capture.c -> obs-studio-18.0.2.tar.xz/plugins/win-capture/game-capture.c
Changed
reset_frame_interval(gc);
obs_enter_graphics();
- if (!gs_shared_texture_available())
- gc->global_hook_info->force_shmem = true;
- obs_leave_graphics();
-
- obs_enter_graphics();
- if (!gs_shared_texture_available())
+ if (!gs_shared_texture_available()) {
+ warn("init_hook_info: shared texture capture unavailable");
gc->global_hook_info->force_shmem = true;
+ }
obs_leave_graphics();
return true;
obs-studio-18.0.1.tar.xz/plugins/win-capture/get-graphics-offsets/d3d9-offsets.cpp -> obs-studio-18.0.2.tar.xz/plugins/win-capture/get-graphics-offsets/d3d9-offsets.cpp
Changed
#ifdef _WIN64
-#define CMP_SIZE 21
-
-static const uint8_t mask[CMP_SIZE] =
-{0xF8, 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
- 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
- 0xFF, 0x00,
- 0xF8, 0xF8, 0x00, 0x00, 0x00, 0x00};
-
-static const uint8_t mask_cmp[CMP_SIZE] =
-{0x48, 0x8B, 0x80, 0x00, 0x00, 0x00, 0x00,
- 0x39, 0x80, 0x00, 0x00, 0x00, 0x00,
- 0x75, 0x00,
- 0x40, 0xB8, 0x00, 0x00, 0x00, 0x00};
+#define MAX_CMP_SIZE 22
+
+static const uint8_t mask[][MAX_CMP_SIZE] = {
+ {
+ 0xF8, 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
+ 0xFF, 0xFF, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0xFF, 0x00,
+ 0xF8, 0xF8, 0x00, 0x00, 0x00, 0x00
+ },
+ {
+ 0xF8, 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
+ 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
+ 0xFF, 0x00,
+ 0xF8, 0xF8, 0x00, 0x00, 0x00, 0x00
+ }
+};
+
+static const uint8_t mask_cmp[][MAX_CMP_SIZE] = {
+ /*
+ * Windows 7
+ * 48 8B 83 B8 3D 00 00 mov rax, [rbx+3DB8h]
+ * 44 39 B8 68 50 00 00 cmp [rax+5068h], r15d
+ * 75 12 jnz short loc_7FF7AA90530
+ * 41 B8 F9 19 00 00 mov r8d, 19F9h
+ */
+ {
+ 0x48, 0x8B, 0x80, 0x00, 0x00, 0x00, 0x00,
+ 0x44, 0x39, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x75, 0x00,
+ 0x40, 0xB8, 0x00, 0x00, 0x00, 0x00
+ },
+ /*
+ * Windows ???+
+ * 49 8B 87 78 41 00 00 mov rax, [r15+4178h]
+ * 39 98 E0 51 00 00 cmp [rax+51E0h], ebx
+ * 75 12 jnz short loc_1800AEC9C
+ * 41 B9 C3 1A 00 00 mov r9d, 1AC3h
+ */
+ {
+ 0x48, 0x8B, 0x80, 0x00, 0x00, 0x00, 0x00,
+ 0x39, 0x80, 0x00, 0x00, 0x00, 0x00,
+ 0x75, 0x00,
+ 0x40, 0xB8, 0x00, 0x00, 0x00, 0x00
+ }
+};
+
+// Offset into the code for the numbers we're interested in
+static const uint32_t code_offsets[][2] = {
+ {3, 10},
+ {3, 9},
+};
#else
-#define CMP_SIZE 19
+#define MAX_CMP_SIZE 20
+
+static const uint8_t mask[][MAX_CMP_SIZE] = {
+ {
+ 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
+ 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
+ 0xFF, 0x00,
+ 0xFF, 0x00, 0x00, 0x00, 0x00
+ },
+ {
+ 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
+ 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00, 0xFF,
+ 0xFF, 0x00,
+ 0xFF, 0x00, 0x00, 0x00, 0x00
+ }
+};
-static const uint8_t mask[CMP_SIZE] =
-{0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
- 0xFF, 0xC0, 0x00, 0x00, 0x00, 0x00,
- 0xFF, 0x00,
- 0xFF, 0x00, 0x00, 0x00, 0x00};
+static const uint8_t mask_cmp[][MAX_CMP_SIZE] = {
+ /*
+ * Windows 7+
+ * 8B 83 E8 29 00 00 mov eax, [ebx+29E8h]
+ * 39 B0 80 4B 00 00 cmp [eax+4B80h], esi
+ * 75 14 jnz short loc_754CD9E1
+ * 68 F9 19 00 00 push 19F9h
+ */
+ {
+ 0x8B, 0x80, 0x00, 0x00, 0x00, 0x00,
+ 0x39, 0x80, 0x00, 0x00, 0x00, 0x00,
+ 0x75, 0x00,
+ 0x68, 0x00, 0x00, 0x00, 0x00
+ },
+
+ /* Windows 10 Creator's Update+
+ * 8B 86 F8 2B 00 00 mov eax, [esi+2BF8h]
+ * 83 B8 00 4D 00 00 00 cmp dword ptr [eax+4D00h], 0
+ * 75 0F jnz short loc_100D793C
+ * 68 C3 1A 00 00 push 1AC3h
+ */
+ {
+ 0x8B, 0x80, 0x00, 0x00, 0x00, 0x00,
+ 0x83, 0x80, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x75, 0x00,
+ 0x68, 0x00, 0x00, 0x00, 0x00
+ }
+};
-static const uint8_t mask_cmp[CMP_SIZE] =
-{0x8B, 0x80, 0x00, 0x00, 0x00, 0x00,
- 0x39, 0x80, 0x00, 0x00, 0x00, 0x00,
- 0x75, 0x00,
- 0x68, 0x00, 0x00, 0x00, 0x00};
+// Offset into the code for the numbers we're interested in
+static const uint32_t code_offsets[][2] = {
+ {2, 8},
+ {2, 8},
+};
#endif
#define MAX_FUNC_SCAN_BYTES 200
-static inline bool pattern_matches(uint8_t *byte)
+static inline bool pattern_matches(uint8_t *byte, uint32_t *offset1,
+ uint32_t *offset2)
{
- for (size_t i = 0; i < CMP_SIZE; i++) {
- if ((byte[i] & mask[i]) != mask_cmp[i])
- return false;
+ for (size_t j = 0; j < sizeof(mask) / sizeof(mask[0]); j++) {
+ for (size_t i = 0; i < MAX_CMP_SIZE; i++) {
+ if ((byte[i] & mask[j][i]) != mask_cmp[j][i])
+ goto next_signature;
+ }
+
+ *offset1 = code_offsets[j][0];
+ *offset2 = code_offsets[j][1];
+
+ return true;
+next_signature:;
}
- return true;
+ return false;
}
void get_d3d9_offsets(struct d3d9_offsets *offsets)
offsets->present_swap = vtable_offset(info.module, info.swap,
3);
+ uint32_t offset1, offset2;
for (size_t i = 0; i < MAX_FUNC_SCAN_BYTES; i++) {
- if (pattern_matches(&crr[i])) {
+ if (pattern_matches(&crr[i], &offset1, &offset2)) {
#define get_offset(x) *(uint32_t*)&crr[i + x]
-#ifdef _WIN64
- uint32_t off1 = get_offset(3);
- uint32_t off2 = get_offset(9);
-#else
- uint32_t off1 = get_offset(2);
- uint32_t off2 = get_offset(8);
-#endif
+ uint32_t off1 = get_offset(offset1);
+ uint32_t off2 = get_offset(offset2);
/* check to make sure offsets are within
* expected values */
obs-studio-18.0.1.tar.xz/plugins/win-capture/graphics-hook/d3d8-capture.cpp -> obs-studio-18.0.2.tar.xz/plugins/win-capture/graphics-hook/d3d8-capture.cpp
Changed
return false;
}
- hlog("d3d8 memory capture successfull");
+ hlog("d3d8 memory capture successful");
return true;
}
obs-studio-18.0.1.tar.xz/plugins/win-capture/window-helpers.c -> obs-studio-18.0.2.tar.xz/plugins/win-capture/window-helpers.c
Changed
struct dstr cur_class = {0};
struct dstr cur_title = {0};
struct dstr cur_exe = {0};
- int class_val = 1;
- int title_val = 1;
- int exe_val = 0;
- int total = 0;
+ int val = 0x7FFFFFFF;
if (!get_window_exe(&cur_exe, window))
- return 0;
+ return 0x7FFFFFFF;
get_window_title(&cur_title, window);
get_window_class(&cur_class, window);
- if (priority == WINDOW_PRIORITY_CLASS)
- class_val += 3;
- else if (priority == WINDOW_PRIORITY_TITLE)
- title_val += 3;
- else
- exe_val += 3;
+ bool class_matches = dstr_cmpi(&cur_class, class) == 0;
+ bool exe_matches = dstr_cmpi(&cur_exe, exe) == 0;
+ int title_val = abs(dstr_cmpi(&cur_title, title));
+ /* always match by name with UWP windows */
if (uwp_window) {
- if (dstr_cmpi(&cur_title, title) == 0 &&
- dstr_cmpi(&cur_exe, exe) == 0)
- total += exe_val + title_val + class_val;
- } else {
- if (dstr_cmpi(&cur_class, class) == 0)
- total += class_val;
- if (dstr_cmpi(&cur_title, title) == 0)
- total += title_val;
- if (dstr_cmpi(&cur_exe, exe) == 0)
- total += exe_val;
+ if (priority == WINDOW_PRIORITY_EXE && !exe_matches)
+ val = 0x7FFFFFFF;
+ else
+ val = title_val == 0 ? 0 : 0x7FFFFFFF;
+
+ } else if (priority == WINDOW_PRIORITY_CLASS) {
+ val = class_matches ? title_val : 0x7FFFFFFF;
+ if (val != 0x7FFFFFFF && !exe_matches)
+ val += 0x1000;
+
+ } else if (priority == WINDOW_PRIORITY_TITLE) {
+ val = title_val == 0 ? 0 : 0x7FFFFFFF;
+
+ } else if (priority == WINDOW_PRIORITY_EXE) {
+ val = exe_matches ? title_val : 0x7FFFFFFF;
}
dstr_free(&cur_class);
dstr_free(&cur_title);
dstr_free(&cur_exe);
- return total;
+ return val;
}
HWND find_window(enum window_search_mode mode,
HWND window = first_window(mode, &parent, &use_findwindowex);
HWND best_window = NULL;
- int best_rating = 0;
+ int best_rating = 0x7FFFFFFF;
if (!class)
return NULL;
while (window) {
int rating = window_rating(window, priority, class, title, exe,
uwp_window);
- if (rating > best_rating) {
+ if (rating < best_rating) {
best_rating = rating;
best_window = window;
+ if (rating == 0)
+ break;
}
window = next_window(window, mode, &parent, use_findwindowex);
obs-studio-18.0.1.tar.xz/plugins/win-dshow/ffmpeg-decode.c -> obs-studio-18.0.2.tar.xz/plugins/win-dshow/ffmpeg-decode.c
Changed
if (decode->packet_size < new_size) {
decode->packet_buffer = brealloc(decode->packet_buffer,
new_size);
+ decode->packet_size = new_size;
}
memset(decode->packet_buffer + size, 0, FF_INPUT_BUFFER_PADDING_SIZE);
obs-studio-18.0.2.tar.xz/plugins/win-dshow/libdshowcapture/.gitattributes
Added
+* text=auto
+
+*.sln text eol=crlf
+*.vcproj text eol=crlf
+*.vcxproj text eol=crlf
+*.vcxproj.filters text eol=crlf
+
+cmake/ALL_BUILD.vcxproj.user.in text eol=crlf
obs-studio-18.0.1.tar.xz/plugins/win-dshow/win-dshow.cpp -> obs-studio-18.0.2.tar.xz/plugins/win-dshow/win-dshow.cpp
Changed
if (!thread)
throw "Failed to create thread";
+ deactivateWhenNotShowing =
+ obs_data_get_bool(settings, DEACTIVATE_WNS);
+
if (obs_data_get_bool(settings, "active")) {
- QueueAction(Action::Activate);
+ bool showing = obs_source_showing(source);
+ if (!deactivateWhenNotShowing || showing)
+ QueueAction(Action::Activate);
+
active = true;
}
}
obs-studio-18.0.1.tar.xz/plugins/win-ivcam/realsense.cpp -> obs-studio-18.0.2.tar.xz/plugins/win-ivcam/realsense.cpp
Changed
void IVCamSource::CamThread()
{
pSegServer = SegServer::CreateServer();
+
+ if (!pSegServer) {
+ warn("SegServer::CreateServer failed\n");
+ return;
+ }
+
SegServer::ServiceStatus status = pSegServer->Init();
if (status != SegServer::ServiceStatus::SERVICE_NO_ERROR) {
obs-studio-18.0.1.tar.xz/plugins/win-wasapi/win-wasapi.cpp -> obs-studio-18.0.2.tar.xz/plugins/win-wasapi/win-wasapi.cpp
Changed
if (!reconnectThread.Valid())
blog(LOG_WARNING, "[WASAPISource::Reconnect] "
- "Failed to intiialize reconnect thread: %lu",
+ "Failed to initialize reconnect thread: %lu",
GetLastError());
}
info.type = OBS_SOURCE_TYPE_INPUT;
info.output_flags = OBS_SOURCE_AUDIO |
OBS_SOURCE_DO_NOT_DUPLICATE |
- OBS_SOURCE_DO_NOT_MONITOR;
+ OBS_SOURCE_DO_NOT_SELF_MONITOR;
info.get_name = GetWASAPIOutputName;
info.create = CreateWASAPIOutput;
info.destroy = DestroyWASAPISource;
Request History
boombatower created request about 8 years ago
boombatower accepted request about 8 years ago
ok