Fixed the logical size for rendering to texture, thanks to Mason Wheeler.
Mason Wheeler:

The SDL_RenderGetLogicalSize function should always return the number of pixels currently available for rendering to. But after updating to the latest SDL2, I started getting crashes because it was returning (0,0) as the logical size! After a bit of debugging, I tracked it down to the following code in SDL_SetRenderTarget:

    if (texture) {
        renderer->viewport.x = 0;
        renderer->viewport.y = 0;
        renderer->viewport.w = texture->w;
        renderer->viewport.h = texture->h;
        renderer->scale.x = 1.0f;
        renderer->scale.y = 1.0f;
        renderer->logical_w = 0;
        renderer->logical_h = 0;
    }

This is obviously wrong; 0 is never a correct value for a valid renderer. Those last two lines should read:

    renderer->logical_w = texture->w;
    renderer->logical_h = texture->h;
This commit is contained in:
parent
fd6dde1327
commit
7ff764a482
1 changed file with 2 additions and 2 deletions
@@ -950,8 +950,8 @@ SDL_SetRenderTarget(SDL_Renderer *renderer, SDL_Texture *texture)
         renderer->viewport.h = texture->h;
         renderer->scale.x = 1.0f;
         renderer->scale.y = 1.0f;
-        renderer->logical_w = 0;
-        renderer->logical_h = 0;
+        renderer->logical_w = texture->w;
+        renderer->logical_h = texture->h;
     } else {
         renderer->viewport = renderer->viewport_backup;
         renderer->clip_rect = renderer->clip_rect_backup;