Mason Wheeler
The SDL_RenderGetLogicalSize function should always return the number of pixels that are currently available for rendering to. But after updating to the latest SDL2, I started getting crashes because it was returning (0,0) as the logical size! After a bit of debugging, I tracked it down to the following code in SDL_SetRenderTarget:
    if (texture) {
        renderer->viewport.x = 0;
        renderer->viewport.y = 0;
        renderer->viewport.w = texture->w;
        renderer->viewport.h = texture->h;
        renderer->scale.x = 1.0f;
        renderer->scale.y = 1.0f;
        renderer->logical_w = 0;
        renderer->logical_h = 0;
    }
This is obviously wrong; a zero logical size is never correct for a valid renderer. Those last two lines should read:
        renderer->logical_w = texture->w;
        renderer->logical_h = texture->h;
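Until that is fixed, callers can defend themselves by treating a zero logical size as "no logical size available" and falling back to the renderer's output size. A minimal sketch of that workaround (the fallback is my suggestion, not part of the report; renderer is assumed to be a valid SDL_Renderer*):

    int w, h;
    SDL_RenderGetLogicalSize(renderer, &w, &h);
    if (w == 0 || h == 0) {
        /* Logical size was cleared (or never set); use the actual
           output size instead so later math never divides by zero. */
        SDL_GetRendererOutputSize(renderer, &w, &h);
    }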
Charles Huber
If SDL_CreateTexture() takes the !IsSupportedFormat() path, it will return an SDL_Texture* whose driverdata member is NULL.
If you then call SDL_GL_BindTexture() on that texture, you get a segfault in GL_BindTexture() when it unconditionally dereferences driverdata.
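A minimal sketch of that crash path through the public API (it assumes the opengl render driver and assumes SDL_PIXELFORMAT_BGR24 is not among its native formats; error checking omitted):

    SDL_Init(SDL_INIT_VIDEO);
    SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl");
    SDL_Window *win = SDL_CreateWindow("bind", 0, 0, 256, 256, SDL_WINDOW_OPENGL);
    SDL_Renderer *r = SDL_CreateRenderer(win, -1, 0);

    /* Assumed-unsupported format: SDL takes the !IsSupportedFormat() path
       and leaves the texture's driverdata NULL. */
    SDL_Texture *t = SDL_CreateTexture(r, SDL_PIXELFORMAT_BGR24,
                                       SDL_TEXTUREACCESS_STATIC, 256, 256);

    float texw, texh;
    SDL_GL_BindTexture(t, &texw, &texh);   /* segfault: GL_BindTexture() dereferences driverdata */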
This resolves lots of confusion around resizable windows. Most people don't expect a viewport to be implicitly set when the renderer is created and then not to be reset to the window size if the window is resized.
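Until now, applications typically papered over this with a manual reset on resize, roughly like the following (a sketch only, with renderer and event assumed to be in scope):

    if (event.type == SDL_WINDOWEVENT &&
        event.window.event == SDL_WINDOWEVENT_SIZE_CHANGED) {
        /* Reset the viewport to the new window size by hand. */
        SDL_Rect vp = { 0, 0, event.window.data1, event.window.data2 };
        SDL_RenderSetViewport(renderer, &vp);
    }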
Added common test command line parameters --logical WxH and --scale N to test the render logical size and scaling APIs.
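For example, running testsprite2 --logical 640x480 --scale 2 exercises both; the particular test binary and values here are just an illustration.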
Marcus von Appen
If one wants to disable the SDL render subsystem, the build breaks on several platforms due to an empty render_drivers array in SDL_render.c.
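A simplified illustration of why (this is a stand-in, not the exact SDL_render.c code): with every backend compiled out, the table is left with an empty initializer list, which is not valid ISO C and which several compilers reject:

    static const SDL_RenderDriver *render_drivers[] = {
    #if SDL_VIDEO_RENDER_OGL && !SDL_RENDER_DISABLED
        &GL_RenderDriver,
    #endif
    };  /* empty initializer list when everything above is disabled */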
Having SDL_SetError() return -1 lets us change things like this...
    if (Failed) {
        SDL_SetError("We failed");
        return -1;
    }
...into this...
    if (Failed) {
        return SDL_SetError("We failed");
    }
Fixes Bugzilla #1778.
Alexander Hirsch 2012-08-25 20:01:29 PDT
When creating an SDL_Texture with an unsupported format (I'll refer to it as
texture A), SDL_CreateTexture will call SDL_CreateTexture again with
GetClosestSupportedFormat to set texture->native (which I will refer to as
texture B).
This causes texture B to be put before A in renderer->textures.
If texture A is explicitly destroyed, everything is fine. Otherwise, upon
SDL_DestroyRenderer, the loop will first encounter texture B and destroy it,
then encounter texture A and destroy that; destroying A tries to destroy
texture->native, and since it is already destroyed, an error gets set.
The solution could be as simple as swapping texture A with B after
texture->native gets set in SDL_CreateTexture.
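A sketch of the failure path through the public API (the format is my assumption; any format the renderer does not support natively would do, and error checking is omitted):

    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("native", 0, 0, 64, 64, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, 0);

    /* Texture A: assumed-unsupported format, so SDL also creates texture B
       (A->native) and links it ahead of A in renderer->textures. */
    SDL_CreateTexture(ren, SDL_PIXELFORMAT_BGR24,
                      SDL_TEXTUREACCESS_STATIC, 32, 32);

    /* Never destroy A explicitly; let the renderer clean up. */
    SDL_ClearError();
    SDL_DestroyRenderer(ren);       /* destroys B first, then A, which finds a stale A->native */
    SDL_Log("error after destroy: %s", SDL_GetError());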
Frank Zago to SDL
For those interested, here's a snapshot of the current port. I did away with
most of the previous attempt, which was based on the sprite engine, because that
support is limited to 128 64x64 sprites. Instead I'm using the gl engine.
The drawback is that only one of the frame buffer and the gl engine can be used
at a time, because there isn't that much video memory on a DS.
With minimal changes to their code, it can now run the following tests:
testspriteminimal, testscale and testsprite2. The last two only run under the
emulator for some reason. The tests are not included in this patch for size
reasons.
In 16-bit mode, the 16th bit indicates transparency/opacity. If it is 0, the color
is not displayed, so I had to patch a few core files to set that bit to 1. See the
patch for src/video/SDL_RLEaccel.c and src/video/SDL_blit.h. Is that ok, or is
there a better way?
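The change roughly amounts to forcing the high bit whenever a 16-bit pixel is written on the DS; a sketch of the idea (the helper name is mine, not from the patch; Uint16 comes from SDL_stdinc.h):

    /* DS 16-bit pixels are 15-bit BGR colour plus a 1-bit opacity flag in
       bit 15; if that bit is 0 the hardware does not display the pixel. */
    static Uint16 NDS_MarkOpaque(Uint16 pixel)
    {
        return (Uint16)(pixel | 0x8000);
    }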
The NDS also doesn't support windowed mode, so I force fullscreen in
src/video/SDL_video.c. Is that ok, or is there a better way?
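The forced-fullscreen part is presumably just a flag override at window-creation time; a rough sketch of that idea (the guard macro and placement are my guesses, not the actual patch):

    #ifdef __NDS__
        /* The DS has no window manager, so treat every window as fullscreen. */
        flags |= SDL_WINDOW_FULLSCREEN;
    #endif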
To get a smaller library, I also tried not compiling the software renderer
when the hardware renderer is compiled in, and defining SDL_NO_COMPAT; however,
the compilation eventually fails in SDL_surface.c because SDL_SRCCOLORKEY is
defined in SDL_compat.h. Is SDL_NO_COMPAT only for applications and not for SDL
itself?