The implementation was slower than the C runtime on Mac OS X, Linux, and
Windows, and quite a bit slower still when using the C fallback instead of
the inline asm.
Fixes Bugzilla #1755.
Kang Seonghoon
While the BMP format supports an alpha channel, it is enabled only when the header is at least 56 bytes long (BITMAPV3INFOHEADER and later). For the very common 40-byte header (BITMAPINFOHEADER), the 32bpp format should be interpreted as BGRX, but SDL currently interprets it as BGRA, which causes a significant compatibility problem: many 32bpp files use a padding byte of 0 ("transparent" in the BGRA interpretation).
---
I fixed this by checking to see if the alpha channel is all 0, and if so, setting it opaque.
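A minimal sketch of that approach (an illustrative reconstruction, not the exact patch): scan the alpha bytes of the 32bpp pixel data, and if every one is 0, rewrite them all as opaque. The helper name and the little-endian byte layout are assumptions here.

#include <SDL.h>

/* Hypothetical helper: treat an all-zero alpha channel as BGRX padding.
   Assumes 32bpp pixels with the alpha byte at offset 3 (little-endian). */
static void CorrectAlphaChannel(SDL_Surface *surface)
{
    SDL_bool hasAlpha = SDL_FALSE;
    Uint8 *alpha = (Uint8 *)surface->pixels + 3;
    Uint8 *end = (Uint8 *)surface->pixels + surface->h * surface->pitch;

    /* First pass: is any alpha byte non-zero? */
    while (alpha < end) {
        if (*alpha != 0) {
            hasAlpha = SDL_TRUE;
            break;
        }
        alpha += 4;
    }

    /* All-zero alpha almost certainly means padding: force opaque. */
    if (!hasAlpha) {
        for (alpha = (Uint8 *)surface->pixels + 3; alpha < end; alpha += 4) {
            *alpha = SDL_ALPHA_OPAQUE;
        }
    }
}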
Sylvain
1/ Load an XPM image with IMG_ReadXPMFromArray().
2/ Try to set a colorkey on it.
I notice that:
- the SDL_Surface created from the XPM has a palette!
- the colorkey is an RGB value.
It crashes (SIGSEGV) inside the SDL_SetColorKey function, at:
"surface->format->palette->colors[surface->map->info.colorkey].a"
Vittorio Giovara
I find that the calling point in SDL_uikitappdelegate.m is dangerous, as the -(void)postFinishLaunch method can be overridden when subclassing.
Could this be moved inside init or into the didFinishLaunchingWithOptions method, which are always called even when subclassed?
Having the SDL functions inline is causing build issues and, in the case of malloc() etc., malloc/free mismatches when the application build environment differs from the SDL build environment.
In the interest of safety and consistency, the functions will always be in the SDL library, and will only be redirected to the C library functions there, if they are available.
See the following threads on the SDL mailing list for the gruesome details:
* SDL_stdinc.h inlines problematic when application not compiled in exact same feature environment
* Error compiling program against SDL2 with -std=c++11 g++ flag
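A sketch of the resulting arrangement, assuming a HAVE_MALLOC configure macro (the fallback allocator name is hypothetical): the header only declares the symbol, and the redirection to the C library happens inside the SDL library itself.

/* SDL_stdinc.h: plain declaration, no inline body */
extern void *SDL_malloc(size_t size);

/* inside the SDL library */
#include <stdlib.h>

void *SDL_malloc(size_t size)
{
#ifdef HAVE_MALLOC
    return malloc(size);              /* C runtime available: forward to it */
#else
    return SDL_malloc_fallback(size); /* hypothetical built-in allocator */
#endif
}

This way SDL_malloc and SDL_free always resolve to the same allocator for both the application and the library, regardless of how the application itself was configured.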
Mason Wheeler
The SDL_RenderGetLogicalSize function should always return the number of pixels currently available for rendering. But after updating to the latest SDL2, I started getting crashes because it was returning (0,0) as the logical size! After a bit of debugging, I tracked it down to the following code in SDL_SetRenderTarget:
if (texture) {
    renderer->viewport.x = 0;
    renderer->viewport.y = 0;
    renderer->viewport.w = texture->w;
    renderer->viewport.h = texture->h;
    renderer->scale.x = 1.0f;
    renderer->scale.y = 1.0f;
    renderer->logical_w = 0;
    renderer->logical_h = 0;
}
This is obviously wrong; 0 is never the correct value for a valid renderer. Those last two lines should read:
    renderer->logical_w = texture->w;
    renderer->logical_h = texture->h;
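To illustrate the crash with a hypothetical caller (not from the report): any code that divides by the logical size, for example to compute an aspect ratio in integer math, divides by zero once (0,0) is returned.

void report_aspect(SDL_Renderer *renderer)
{
    int lw, lh;
    SDL_RenderGetLogicalSize(renderer, &lw, &lh);
    /* With the buggy (0,0) result, this integer division by lh is a
       division by zero (SIGFPE on most platforms). */
    int aspect_pct = 100 * lw / lh;
    SDL_Log("logical aspect: %d%%", aspect_pct);
}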
Check the hint at initialization time, as an optimization. This isn't something we expect the application to change at runtime; if it is, we should add an API for it.
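A sketch of the pattern, with hypothetical names for the subsystem and the hint: read the hint once when the subsystem is created, cache the result, and consult the cached flag afterwards instead of re-querying SDL_GetHint on every call.

#include <SDL.h>

typedef struct {
    SDL_bool fast_path_enabled;  /* cached at init; hypothetical field */
} Subsystem;

static void Subsystem_Init(Subsystem *s)
{
    /* SDL_SOME_HINT is a placeholder name; SDL_GetHint returns NULL
       when the hint is unset. */
    const char *hint = SDL_GetHint("SDL_SOME_HINT");
    s->fast_path_enabled = (hint && *hint != '0') ? SDL_TRUE : SDL_FALSE;
}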