Nitz
In SDL_CreateTextureFromSurface:
SDL_PixelFormat *dst_fmt;
/* Set up a destination surface for the texture update */
dst_fmt = SDL_AllocFormat(format);
temp = SDL_ConvertSurface(surface, dst_fmt, 0);
A NULL check is needed for dst_fmt here, because SDL_AllocFormat(format) can return NULL.
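A minimal sketch of the guarded call (the early return mirrors how SDL_CreateTextureFromSurface() reports failure by returning NULL):

    dst_fmt = SDL_AllocFormat(format);
    if (!dst_fmt) {
        return NULL;    /* SDL_AllocFormat() has already set the error */
    }
    temp = SDL_ConvertSurface(surface, dst_fmt, 0);
    SDL_FreeFormat(dst_fmt);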
This fixes an issue where an empty cliprect is treated the same as a NULL
cliprect, causing the render backends to disable clipping.
Also adds a new API, SDL_RenderIsClipEnabled(render) that allows you to
differentiate between:
- SDL_RenderSetClipRect(render, NULL)
- SDL_Rect r = {0,0,0,0}; SDL_RenderSetClipRect(render, &r);
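A short sketch of how the new query tells the two apart (SDL_RenderIsClipEnabled() returns SDL_bool):

    SDL_Rect r = { 0, 0, 0, 0 };

    SDL_RenderSetClipRect(render, NULL);
    /* no clip rect at all: clipping is disabled */
    SDL_assert(SDL_RenderIsClipEnabled(render) == SDL_FALSE);

    SDL_RenderSetClipRect(render, &r);
    /* empty clip rect: everything is clipped out, but clipping stays enabled */
    SDL_assert(SDL_RenderIsClipEnabled(render) == SDL_TRUE);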
Fixes https://bugzilla.libsdl.org/show_bug.cgi?id=2504
This is actually a false positive in this case, since Clang doesn't know
that SDL_SetError() only ever returns -1. A feature request to improve that,
with an explanation of these specific SDL patches, is here:
http://llvm.org/bugs/show_bug.cgi?id=19208
The D3D11 renderer is now slightly faster than D3D9 on my Windows 8 machine (testsprite2 runs at 3400 FPS vs 3100 FPS)
This will need tweaking to fix the Windows RT build.
(The renderer source was renamed from src/render/direct3d11/SDL_render_d3d11.cpp to src/render/direct3d11/SDL_render_d3d11.c.)
SDL 2.x recently accepted patches to enable OpenGL ES 2 support via Google's ANGLE library. The thought is to try to eventually merge SDL/WinRT's OpenGL code with SDL-official's.
driedfruit
SDL_RenderCopy clips dstrect against the viewport, then adjusts srcrect by
the "appropriate" number of pixels. That adjustment can be off by quite a
lot, because of the rounding errors introduced by the "* factor / factor"
scaling:
    real_srcrect.x += (deltax * real_srcrect.w) / dstrect->w;
    real_srcrect.w += (deltaw * real_srcrect.w) / dstrect->w;
For example:
I have a 32 x 32 srcrect and a 64 x 64 dstrect. So far the
stretching is done perfectly, by a factor of 2.
Now, consider dstrect being clipped against the viewport, so it becomes
56 x 64. Now the factor becomes 1.75! The adjustment to srcrect can't
handle this, because srcrect is in integers.
And thus we now have an incorrect mapping, with dstrect no longer in the
right proportion to srcrect.
The problem is most evident when upscaling stuff, like displaying an 8x8
texture with a zoom of 64 or more, and moving it beyond the corners of
the screen. It *looks* really, really bad.
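A small arithmetic sketch of the truncation (the numbers here are illustrative, not from the report):

    int src_w = 32, dst_w = 64;         /* a perfect factor of 2 */
    int deltaw = -7;                    /* viewport clips 7 pixels off dstrect */

    src_w += (deltaw * src_w) / dst_w;  /* (-7 * 32) / 64 == -3, truncated from -3.5 */
    dst_w += deltaw;

    /* src_w == 29, dst_w == 57: the factor is now ~1.97 instead of 2 */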
Note: SDL_RenderCopyEx does no such clipping, and is right to do so. The fix would be to remove any such clipping from SDL_RenderCopy too. And then fix the software renderer, because it has the same fault, independently of SDL_RenderCopy.
[attached patch]
this leaves the software renderer buggy, as it does its own clipping later on
Lloyd Bryant
SDL_CreateTexture() succeeds (i.e. returns a valid pointer) even when the requested horizontal or vertical size of the texture exceeds the maximum allowed by the renderer. This results in hard-to-understand errors showing up later, when the texture is actually used (such as with SDL_SetRenderTarget()).
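Until SDL_CreateTexture() checks this itself, a hedged workaround sketch is to query the renderer's limits up front via SDL_GetRendererInfo():

    SDL_RendererInfo info;

    if (SDL_GetRendererInfo(renderer, &info) == 0) {
        /* some backends report 0 when the limit is unknown */
        if ((info.max_texture_width  && w > info.max_texture_width) ||
            (info.max_texture_height && h > info.max_texture_height)) {
            /* the texture would exceed the renderer's limits; fail here
               rather than getting a confusing error later */
        }
    }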
Mason Wheeler
The SDL_RenderGetLogicalSize function should always return the number of pixels currently available for rendering. But after updating to the latest SDL2, I started getting crashes because it was returning (0,0) as the logical size! After a bit of debugging, I tracked it down to the following code in SDL_SetRenderTarget:
    if (texture) {
        renderer->viewport.x = 0;
        renderer->viewport.y = 0;
        renderer->viewport.w = texture->w;
        renderer->viewport.h = texture->h;
        renderer->scale.x = 1.0f;
        renderer->scale.y = 1.0f;
        renderer->logical_w = 0;
        renderer->logical_h = 0;
    }
This is obviously wrong; 0 is never the correct value for a valid renderer. Those last two lines should read:
        renderer->logical_w = texture->w;
        renderer->logical_h = texture->h;
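With that change, a quick sanity check would look like this (a sketch assuming the proposed fix; renderer and target are illustrative names):

    int w, h;

    SDL_SetRenderTarget(renderer, target);     /* target is a render-target texture */
    SDL_RenderGetLogicalSize(renderer, &w, &h);
    /* w and h should now match the target texture's size, not (0,0) */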
Charles Huber
If SDL_CreateTexture() takes the !IsSupportedFormat() path, it will return an SDL_Texture * with a NULL driverdata member.
If you then call SDL_GL_BindTexture() on it, this will cause a segfault in GL_BindTexture() when it unconditionally dereferences driverdata.
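A hedged sketch of the obvious backend-side guard (the error text is illustrative, and the real fix may instead be to make SDL_CreateTexture() fail up front):

    GL_TextureData *data = (GL_TextureData *) texture->driverdata;

    if (!data) {
        return SDL_SetError("Texture is not currently available");  /* don't dereference NULL */
    }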
This resolves a lot of confusion around resizable windows. Most people don't expect a viewport to be implicitly set when the renderer is created, and then not be reset to the window size when the window is resized.
Added common test command line parameters --logical WxH and --scale N to test the render logical size and scaling APIs.
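For example (invocation illustrative):

    testsprite2 --logical 640x480 --scale 2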
Marcus von Appen
If one wants to disable the SDL render subsystem, the build breaks on several platforms due to an empty render_drivers array in SDL_render.c.
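An empty initializer list isn't valid C, so when every driver is compiled out the array definition fails to build. One way to sketch a guard (not necessarily the applied fix):

    #if !SDL_RENDER_DISABLED
    static const SDL_RenderDriver *render_drivers[] = {
        &GL_RenderDriver,
        /* ... other enabled drivers ... */
    };
    #endif /* !SDL_RENDER_DISABLED */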
This lets us change things like this...

    if (Failed) {
        SDL_SetError("We failed");
        return -1;
    }

...into this...

    if (Failed) {
        return SDL_SetError("We failed");
    }
Fixes Bugzilla #1778.