Hi Folks,
A recent issue with AA StrikeFonts reported on squeak-dev made me run some tests with BitBlt. I found that a couple of combination rules work correctly in the simulator but give incorrect results on the VM. I suspect there is a 32-bit overflow somewhere, or a value that should be unsigned but is declared int. The following code:
	"20 rgbAdd:with:  21 rgbSub:with:  27 rgbMax:with:
	 28 rgbMin:with:  29 rgbMinInvert:with:  37 rgbMul:with:"
	rules := #(20 21 27 28 29 37).
	results := rules collect: [ :r |
		f1 := Form extent: 8@8 depth: 32.
		f1 fillColor: (Color r: 0.2 g: 0.2 b: 0.3 alpha: 0.2).
		f2 := Form extent: 8@8 depth: 32.
		f2 fillColor: (Color r: 0.1 g: 1.0 b: 0.1 alpha: 0.1).
		bb := (BitBlt toForm: f1)
			sourceForm: f2;
			combinationRule: r;
			copyBitsSimulated.
		c := f1 colorAt: 2@2.
		(c red roundTo: 0.001) @ (c alpha roundTo: 0.001) ].
answers the following:

	#(0.297@0.298 0.102@0.102 0.199@0.2 0.098@0.098 0.199@0.2 0.02@0.02)

Each point corresponds to one of the tested rules: x is the red value and y is the alpha value. In this case, red and alpha come out the same (or at least very close).
But if I replace #copyBitsSimulated with #copyBits I get:

	#(0.297@1.0 0.102@0.102 0.199@0.2 0.098@0.098 0.199@0.902 0.02@0.02)
Note that rgbAdd and rgbMinInvert produce completely wrong alpha values. I suspect this has to do with alpha being stored in the most significant byte of each word, and that there is an overflow there, but after reading the code and running some tests in the simulator I could not find it. I didn't go as far as using a C debugger, and perhaps somebody can help.
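For reference, here is a minimal C sketch of what a per-channel saturating add over a 32-bit ARGB word has to do (this is NOT the VM's actual code; the function name and the per-byte loop are my own illustration). The point it tries to make: alpha sits in the top byte, so a carry out of the alpha channel is a carry out of bit 31 of the whole word. Any shortcut that folds the four channel adds into one word-wide operation, or that holds the word in a signed int, has to handle exactly that top-byte carry specially, which would match rgbAdd and rgbMinInvert (the carry-sensitive rules) being the ones that go wrong:

```c
#include <stdint.h>

/* Hypothetical per-channel saturating add on a 32-bit ARGB pixel.
   Each byte is added independently and clamped to 0xFF, so a carry
   in one channel can never leak into its neighbor. Done word-wide
   instead, the alpha (MSB) channel's carry escapes past bit 31,
   which is where unsigned vs. signed 32-bit arithmetic matters. */
static uint32_t rgb_add_saturating(uint32_t a, uint32_t b)
{
    uint32_t result = 0;
    for (int shift = 0; shift < 32; shift += 8) {
        uint32_t ca = (a >> shift) & 0xFF;   /* channel of a */
        uint32_t cb = (b >> shift) & 0xFF;   /* channel of b */
        uint32_t sum = ca + cb;
        if (sum > 0xFF) sum = 0xFF;          /* saturate per channel */
        result |= sum << shift;
    }
    return result;
}
```

With the fill colors from the snippet above (alpha bytes roughly 0x33 and 0x19), the correct alpha sum is about 0x4C, i.e. ~0.3, which is what the simulator reports; the VM's 1.0 looks like the alpha channel saturating when it should not.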
Thanks, Juan Vuletich
vm-dev@lists.squeakfoundation.org