Use the Cohen-Sutherland algorithm for line clipping, which uses integer math and preserves the ordering of clipped points.
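For reference, the core of Cohen-Sutherland is an outcode test against the clip rectangle; here is a minimal sketch in C (illustrative only, not SDL's actual implementation; the names are made up):

#define CODE_LEFT   1
#define CODE_RIGHT  2
#define CODE_BOTTOM 4
#define CODE_TOP    8

/* Classify a point against the clip rectangle using only integer
   comparisons; an outcode of 0 means the point is inside. */
static int ComputeOutCode(int x, int y, const SDL_Rect *clip)
{
    int code = 0;
    if (x < clip->x)                  code |= CODE_LEFT;
    else if (x >= clip->x + clip->w)  code |= CODE_RIGHT;
    if (y < clip->y)                  code |= CODE_BOTTOM;
    else if (y >= clip->y + clip->h)  code |= CODE_TOP;
    return code;
}

A segment is trivially accepted when both endpoint outcodes are 0 and trivially rejected when their bitwise AND is nonzero; otherwise it is clipped against one violated edge and retested.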
Removed getopt() support in testsdl.c and replaced it with simple argv scanning.
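A sketch of the kind of argv scanning meant here (illustrative; the flag and variable names are made up, not those of testsdl.c):

int i;
for (i = 1; i < argc; ++i) {
    if (SDL_strcmp(argv[i], "--fullscreen") == 0) {
        fullscreen = 1;                 /* boolean flag */
    } else if (SDL_strcmp(argv[i], "--width") == 0 && i + 1 < argc) {
        width = SDL_atoi(argv[++i]);    /* flag taking an argument */
    }
}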
John Popplewell 2009-12-08 23:05:50 PST
Originally reported by AKFoerster on the mailing list.
Error decoding UTF-8 Russian text to UTF-16LE, specifically on
platforms without iconv support (the default on Windows).
Valid UTF-8 characters are flagged as overlong and then substituted with the
UNKNOWN_UNICODE character.
After studying the testiconv.c example program, reading the RFCs, and putting
some printf statements in SDL_iconv.c, I traced the problem to a test for
'Maximum overlong sequences', specifically 4.2.1, which is carried out by the
following code:
} else if ( p[0] >= 0xC0 ) {
    if ( (p[0] & 0xE0) != 0xC0 ) {
        /* Skip illegal sequences
        return SDL_ICONV_EILSEQ;
        */
        ch = UNKNOWN_UNICODE;
    } else {
        if ( (p[0] & 0xCE) == 0xC0 ) {    <<<<<<<< here
            overlong = SDL_TRUE;
        }
        ch = (Uint32)(p[0] & 0x1F);
        left = 1;
    }
} else {
Here is the 2-byte encoding of a character in the range 00000080 - 000007FF:

110xxxxx 10xxxxxx

The line in question is supposed to be checking for an overlong sequence,
i.e. anything no greater than

11000001 10111111

whose value fits in a single byte and therefore should have been encoded as one.
BUT, the mask value (0xCE) is wrong: it isn't checking bit 4 (0x10) of the
lead byte:

11000001 value
11001110 mask (incorrect)
   ^

and should be (0xDE):

11000001 value
11011110 mask (correct)
   ^
making the above code:
} else if ( p[0] >= 0xC0 ) {
    if ( (p[0] & 0xE0) != 0xC0 ) {
        /* Skip illegal sequences
        return SDL_ICONV_EILSEQ;
        */
        ch = UNKNOWN_UNICODE;
    } else {
        if ( (p[0] & 0xDE) == 0xC0 ) {    <<<<<<<< here
            overlong = SDL_TRUE;
        }
        ch = (Uint32)(p[0] & 0x1F);
        left = 1;
    }
} else {
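The bad mask explains the Russian-text symptom directly: Cyrillic characters
encode with lead bytes 0xD0/0xD1, and 0xD0 & 0xCE == 0xC0, so every such
character was flagged as overlong. A tiny standalone check (illustrative,
not part of the patch):

#include <stdio.h>

int main(void)
{
    unsigned char lead = 0xD0;  /* lead byte of e.g. U+0416, Cyrillic Zhe */
    printf("0xCE mask flags it: %d\n", (lead & 0xCE) == 0xC0);  /* prints 1 */
    printf("0xDE mask flags it: %d\n", (lead & 0xDE) == 0xC0);  /* prints 0 */
    return 0;
}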
I can supply a test program and/or a patch if required,
best regards,
John Popplewell
Added SDL_GetDisplayBounds() and implemented multi-monitor window positions on Windows.
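A sketch of querying per-display bounds, using the signature this call has in
today's SDL2 headers (the 1.3-era signature at the time of this commit may
have differed):

#include "SDL.h"
#include <stdio.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    int i, n = SDL_GetNumVideoDisplays();
    for (i = 0; i < n; ++i) {
        SDL_Rect r;
        if (SDL_GetDisplayBounds(i, &r) == 0) {
            /* Bounds are in the global desktop coordinate space,
               so they can be used to position windows per monitor. */
            printf("display %d: %dx%d at (%d,%d)\n", i, r.w, r.h, r.x, r.y);
        }
    }
    SDL_Quit();
    return 0;
}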
If it loses any of those properties, the desktop mode will be restored.
* Added a display parameter to many internal functions so video modes can be set on displays other than the publicly current one.
* The fullscreen mode is associated with fullscreen windows, not displays, so different windows more naturally have a mode associated with them based on their width and height. It's no longer necessary to specify a fullscreen mode; a default one will be picked automatically for fullscreen windows (see the sketch after this list).
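A sketch of what this means for callers, using the SDL2 descendants of these
APIs (assumed here; the names may differ in the 1.3 snapshot this commit
targets):

/* No SDL_DisplayMode is required up front; a mode matching the window's
   size is picked automatically when it goes fullscreen. */
SDL_Window *window = SDL_CreateWindow("demo",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
        800, 600, 0);
SDL_SetWindowFullscreen(window, SDL_WINDOW_FULLSCREEN);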
I did it around each call to cpuid, which isn't strictly necessary but is definitely future-proof. :)
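Presumably "it" here is saving and restoring the PIC register around cpuid;
the usual GCC inline-assembly pattern on 32-bit x86 looks like this (a sketch
under that assumption, not necessarily the exact SDL_cpuinfo.c code):

static void cpuid(int func, int *a, int *b, int *c, int *d)
{
    __asm__ __volatile__ (
        "pushl %%ebx       \n\t"  /* save ebx: it's the PIC base register */
        "cpuid             \n\t"
        "movl %%ebx, %%esi \n\t"  /* copy ebx out before restoring it */
        "popl %%ebx        \n\t"
        : "=a" (*a), "=S" (*b), "=c" (*c), "=d" (*d)
        : "a" (func));
}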
If type is ::SDL_HAPTIC_CARTESIAN, direction is encoded by three positions
(X axis, Y axis and Z axis (with 3 axes)). ::SDL_HAPTIC_CARTESIAN uses
the first three \c dir parameters. The cardinal directions would be:
 - North:  0,-1, 0
 - East:  -1, 0, 0
 - South:  0, 1, 0
 - West:   1, 0, 0
typedef struct SDL_HapticDirection
{
    Uint8 type;     /**< The type of encoding. */
    Uint16 dir[3];  /**< The encoded direction. */
} SDL_HapticDirection;
An unsigned int can't store negative values, and I don't see an alternate way to
encode them in the docs or source. The best I have been able to come up with is
using a negative magnitude for the effect, but this will only get me 2 of the 4
quadrants in the plane for 2D effects. I looked at the win32 and linux
implementations and I believe it is safe to use signed ints in the direction
struct. I am unfamiliar with the darwin haptics API, so I don't know if it is
safe there.
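For what it's worth, the struct did end up signed: in today's SDL2 headers the
field is Sint32 dir[3], so the doc excerpt's cardinal directions can be encoded
directly. A sketch under that (SDL2) assumption:

SDL_HapticDirection direction;
direction.type = SDL_HAPTIC_CARTESIAN;
direction.dir[0] = -1;  /* East, per the doc excerpt above */
direction.dir[1] = 0;
direction.dir[2] = 0;   /* Z unused for a 2D effect */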
The D3D renderer will try mapping YV12 and I420 (IYUV) to D3D texture formats via FOURCC. This enables HW acceleration for those formats when the driver is capable (most are). Note that SDL's IYUV maps to the I420 FOURCC on Windows.
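A sketch of how a renderer can probe the driver for FOURCC texture support in
D3D9 (illustrative, not SDL's exact code; assumes d3d is an initialized
IDirect3D9 *):

#include <d3d9.h>

D3DFORMAT yv12 = (D3DFORMAT)MAKEFOURCC('Y', 'V', '1', '2');
HRESULT hr = IDirect3D9_CheckDeviceFormat(d3d, D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, 0, D3DRTYPE_TEXTURE, yv12);
if (SUCCEEDED(hr)) {
    /* The driver can create YV12 textures; upload the planes directly
       instead of converting to RGB on the CPU. */
}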
Currently SDL uses GL_RGB as the internalFormat when GL_YCBCR_MESA is passed as the format to glTexImage2D under Linux Mesa's OpenGL. This is wrong and makes glTexImage2D fail with an invalid-argument error. GL_YCBCR_MESA should also be the internalFormat (not GL_RGB) there, which matches the various source codes using GL_YCBCR_MESA that turn up when googling.
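Per the MESA_ycbcr_texture extension, internalFormat and format must both be
GL_YCBCR_MESA; a minimal sketch (assumes the extension is present and that w,
h, and pixels are supplied by the caller):

glTexImage2D(GL_TEXTURE_2D, 0,
             GL_YCBCR_MESA,               /* internalFormat must match... */
             w, h, 0,
             GL_YCBCR_MESA,               /* ...the format parameter */
             GL_UNSIGNED_SHORT_8_8_MESA,  /* packed YUY2 component order */
             pixels);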