- the video-pipeline in cocoa layers now goes through Layer::feed() and Layer::do_filters()
- image parameters are applied by the layer controller when a new frame is provided
by the underlying CVLayer through feedFrame
- moving filter-related code out of the layer controller
by creating a Filter subclass: CVFilter (see the sketch below)
- making CoreImage filters available to javascript when running on OSX
Note: this is a work in progress and the implementation is still unfinished
(CoreImage filters won't work in this revision)
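To give an idea of the direction, here is a rough sketch of what such a CVFilter wrapper could look like; the Filter base class, the apply()/set_parameter() names and the memory handling are assumptions of this sketch, only the CIFilter calls are standard CoreImage API:

    // Hypothetical sketch: a Filter subclass wrapping a CoreImage CIFilter.
    // The Filter base class and method names are assumptions; only the
    // CIFilter calls are standard CoreImage API (pre-ARC Objective-C++).
    #import <QuartzCore/QuartzCore.h>

    class CVFilter /* : public Filter, base class assumed */ {
    public:
        explicit CVFilter(NSString *ciFilterName) {
            ciFilter = [[CIFilter filterWithName:ciFilterName] retain];
            [ciFilter setDefaults];
        }
        ~CVFilter() { [ciFilter release]; }

        // Apply the wrapped CIFilter to an incoming frame and hand the
        // filtered image back to the caller.
        CIImage *apply(CIImage *inputImage) {
            [ciFilter setValue:inputImage forKey:@"inputImage"];
            return [ciFilter valueForKey:@"outputImage"];
        }

        // Expose a numeric parameter so it could be driven from javascript.
        void set_parameter(NSString *key, double value) {
            [ciFilter setValue:[NSNumber numberWithDouble:value] forKey:key];
        }

    private:
        CIFilter *ciFilter;
    };

A javascript binding would then only need to expose set_parameter() and the list of available CIFilter names.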
refactoring the feedFrame/renderFrame/getTexture flow to make smarter
use of memory and autorelease pools
started implementing CVFilter
(on the way to having CoreImage filters usable from javascript)
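As a reference for what "smarter use of memory and autorelease pools" means in practice, the sketch below gives each rendered frame its own autorelease pool; NSAutoreleasePool and the CVPixelBufferRetain/Release calls are real Cocoa/CoreVideo API, while the function names and buffer handling are assumptions:

    #import <Cocoa/Cocoa.h>
    #import <CoreVideo/CoreVideo.h>

    // Hypothetical per-frame render step (pre-ARC Objective-C++).
    static void apply_filters(CVPixelBufferRef frame) { /* filter pass would go here */ }

    static void render_one_frame(CVPixelBufferRef frame)
    {
        // One pool per frame: temporaries autoreleased while filtering this
        // frame are freed as soon as the frame has been rendered, instead of
        // accumulating until some outer pool drains.
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        if (frame) {
            CVPixelBufferRetain(frame);   // keep the buffer alive while we work on it
            apply_filters(frame);
            CVPixelBufferRelease(frame);
        }

        [pool release];   // drain the per-frame pool
    }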
long since obsolete; removing it cleans up the Screen API
eventually we might want to have more screens, with magnification
in the future using algorithms like those found in MAME (hq2x and such)
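None of this exists yet; as a baseline, the simplest possible magnification step (plain 2x pixel doubling) would look like the sketch below, with hq2x-style scalers being smarter drop-in replacements for the inner copy (the 32bpp buffer layout is an assumption):

    #include <stdint.h>

    // Trivial 2x pixel doubling over a 32bpp frame: the baseline that an
    // hq2x-style scaler would replace with edge-aware interpolation.
    static void magnify_2x(const uint32_t *src, uint32_t *dst, int w, int h)
    {
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                uint32_t px = src[y * w + x];
                uint32_t *out = dst + (y * 2) * (w * 2) + (x * 2);
                out[0] = px;            // top-left
                out[1] = px;            // top-right
                out[w * 2] = px;        // bottom-left
                out[w * 2 + 1] = px;    // bottom-right
            }
        }
    }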
any subclass of CVCocoaLayer with a controller associated with it can now
be controlled through the filter panel.
The filter panel for a specific layer can now be opened by
double-clicking on the layer name in the listview.
a bit of refactoring in CVFilterPanel/CVLayerView/CVLayerController
has been done to allow viewless layers to be created and still be
controlled through a filter panel
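The double-click wiring is plain AppKit target/doubleAction plumbing; a minimal sketch follows, where the LayerListDelegate class and its method names are assumptions and only the NSTableView/NSPanel calls are standard API:

    #import <Cocoa/Cocoa.h>

    // Hypothetical glue for "double-click a layer name to open its filter panel".
    @interface LayerListDelegate : NSObject {
        IBOutlet NSTableView *layerList;
        IBOutlet NSPanel     *filterPanel;   // stand-in for CVFilterPanel
    }
    - (void)layerDoubleClicked:(id)sender;
    @end

    @implementation LayerListDelegate

    - (void)awakeFromNib
    {
        // Route double-clicks on the list view to our handler.
        [layerList setTarget:self];
        [layerList setDoubleAction:@selector(layerDoubleClicked:)];
    }

    - (void)layerDoubleClicked:(id)sender
    {
        NSInteger row = [layerList clickedRow];
        if (row < 0)
            return;
        // The real code would look up the CVLayerController for this row
        // and attach the filter panel to it before showing the panel.
        [filterPanel makeKeyAndOrderFront:self];
    }

    @end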
I'm leaving the calls to JS_SetContextThread() and JS_ClearContextThread()
commented out because we still need to carefully check what's going to
be called asynchronously from different threads
(I'll remove the superfluous ones later on)
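For reference, once the threading picture is clear those calls would follow the usual JSAPI pattern: a thread that wants to touch a JSContext created elsewhere claims it first and releases it afterwards. The run_callback() wrapper below is hypothetical; only the JSAPI calls are real:

    #include <jsapi.h>   // the SpiderMonkey version in use at the time

    // Hypothetical wrapper around a callback invoked from a thread other
    // than the one that created the JSContext.
    static JSBool run_callback(JSContext *cx, JSObject *obj, jsval fval)
    {
        jsval rval;

        JS_SetContextThread(cx);     // claim the context for this thread
        JS_BeginRequest(cx);         // enter a request for GC safety

        JSBool res = JS_CallFunctionValue(cx, obj, fval, 0, NULL, &rval);

        JS_EndRequest(cx);
        JS_ClearContextThread(cx);   // release it so another thread can claim it

        return res;
    }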
the most serious fix here was to flag the cocoa keyboardcontroller as
indestructible, to avoid crashes when loading a new script after a reset()
(the keyboardcontroller was being destroyed while still in use as a singleton,
so the new script ended up referencing a dead instance)
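The fix boils down to something along these lines (all names below are hypothetical, not the actual class names): the registry that tears controllers down on reset() skips entries flagged as indestructible, so singletons survive a script reload:

    #include <vector>

    // Hypothetical sketch of the "indestructible" flag: objects registered
    // in a pool are normally destroyed on reset(), but singletons such as
    // the cocoa keyboard controller are flagged so they survive a reload.
    class Controller {
    public:
        Controller() : indestructible(false) {}
        virtual ~Controller() {}
        bool indestructible;   // set to true on singletons shared across scripts
    };

    class ControllerPool {
    public:
        void add(Controller *c) { controllers.push_back(c); }

        // Called when a new script is loaded after reset(): drop everything
        // except the controllers flagged as indestructible.
        void reset() {
            std::vector<Controller *> kept;
            for (size_t i = 0; i < controllers.size(); i++) {
                if (controllers[i]->indestructible)
                    kept.push_back(controllers[i]);   // keep the singleton alive
                else
                    delete controllers[i];
            }
            controllers = kept;
        }

    private:
        std::vector<Controller *> controllers;
    };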