Slick Forums

Discuss the Slick 2D Library
Posted: Thu Jun 20, 2013 6:49 am

MatthiasM,
Thanks for the great conversation yesterday on IRC; I learnt quite a bit from that.

I've had a more detailed look at your createDecoderTexture method, and one thing I disagree with is this part:

Code:
 private LWJGLTexture createDecoderTexture(URL textureUrl, TextureDecoder dec, LWJGLTexture.Format fmt, LWJGLTexture.Filter filter, TexturePostProcessing tpp) throws IOException {
        dec.open();
        try {
            fmt = dec.decideTextureFormat(fmt);
            if(fmt == null) {
                throw new NullPointerException("TextureDecoder.decideTextureFormat() returned null");
            }


By having the decideTextureFormat method take LWJGLTexture.Format as a parameter, you have hard-coded every decoder to use LWJGLTexture.Format, i.e. fixed it to this one renderer. The decoders should be renderer independent: they decode files, they don't decide how things are rendered to the screen. In my own attempt to solve this (before I found out you had done it this way), I changed things so that a String naming the format was passed to this createDecoderTexture method (i.e. it was not converted to the final enum), and it was converted to LWJGLTexture.Format after the method returned.

I personally would change the above code to

Code:
 private LWJGLTexture createDecoderTexture(URL textureUrl, TextureDecoder dec, String formatStr, LWJGLTexture.Filter filter, TexturePostProcessing tpp) throws IOException {
        dec.open();
        try {
            formatStr = dec.decideTextureFormat(formatStr);
            if(formatStr == null) {
                throw new NullPointerException("TextureDecoder.decideTextureFormat() returned null");
            }

            LWJGLTexture.Format fmt = LWJGLTexture.Format.COLOR;
            try {
                fmt = LWJGLTexture.Format.valueOf(formatStr.toUpperCase(Locale.ENGLISH));
            } catch(IllegalArgumentException ex) {
                getLogger().log(Level.WARNING, "Unknown texture format: {0}", formatStr);
            }


This would also mean TextureDecoder could move to the renderer package and the same decoder could be used for android for instance if there was an android renderer.

The decode method

Code:
    public void decode(ByteBuffer buf, int stride, LWJGLTexture.Format fmt) throws IOException;


also doesn't need fmt, since you already have it from the previous method call, nor stride (the decoder should know this already!). You could even let decode return a ByteBuffer and have the decoder do all the construction of the ByteBuffer itself. That gives more power to the decoder.

Personally I would remove the decideTextureFormat method, pass decode the string version of the format, and add getStride(), getData(), getWidth() and getHeight() methods on the decoder to use for further processing. But that's just my view :D
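A minimal sketch of what I mean (the interface and method names here are just my suggestion, not TWL's actual API), with a dummy decoder to show that nothing LWJGL-specific is needed:

```java
import java.io.IOException;
import java.nio.ByteBuffer;

public class DecoderSketch {
    // Hypothetical renderer-independent interface: no LWJGLTexture types and
    // no decideTextureFormat - the decoder exposes its own metrics instead.
    interface RendererIndependentDecoder {
        void decode() throws IOException;  // read the file, fill the buffer
        String getFormat();                // e.g. "COLOR" or "ALPHA"
        int getWidth();
        int getHeight();
        int getStride();                   // bytes per row - the decoder knows this
        ByteBuffer getData();              // decoded pixels, usable by any renderer
    }

    // Dummy decoder producing a 2x2 opaque red RGBA image, for illustration only.
    public static class DummyDecoder implements RendererIndependentDecoder {
        private ByteBuffer data;

        public void decode() {
            data = ByteBuffer.allocateDirect(getStride() * getHeight());
            for (int i = 0; i < getWidth() * getHeight(); i++) {
                data.put((byte) 0xFF).put((byte) 0).put((byte) 0).put((byte) 0xFF);
            }
            data.flip();
        }
        public String getFormat() { return "COLOR"; }
        public int getWidth()  { return 2; }
        public int getHeight() { return 2; }
        public int getStride() { return getWidth() * 4; }  // 4 bytes per RGBA pixel
        public ByteBuffer getData() { return data; }
    }

    public static void main(String[] args) throws IOException {
        DummyDecoder dec = new DummyDecoder();
        dec.decode();
        System.out.println(dec.getData().remaining());  // prints 16
    }
}
```

Any renderer (LWJGL, JOGL, Android) could consume getData()/getStride()/getWidth()/getHeight() and do its own upload.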

I know you will probably not want to do this either, but I would also remove the PNG formats from the LWJGLTexture.Format enum. This Format enum should be decoder independent.


Posted: Sat Jun 22, 2013 7:25 am (MatthiasM)
The decoder support is part of the renderer implementation, not of the renderer API. Different renderers may have different sets of available formats.
Putting the enum in the API would restrict their implementation, and using a String would move the error checking to runtime.
The enums currently defined in LWJGLTexture.Format are the ones that class supports for uploading to OpenGL.
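A minimal example of that runtime vs compile-time difference (my own illustration, not TWL code):

```java
public class FormatCheck {
    // Stand-in format enum for illustration.
    enum Format { COLOR, ALPHA }

    public static void main(String[] args) {
        // With an enum, a misspelt format simply does not compile:
        // Format f = Format.COLOUR;  // compile error
        Format f = Format.COLOR;

        // With a String, the same mistake only surfaces when the code runs:
        try {
            Format.valueOf("COLOUR");
            System.out.println("never reached");
        } catch (IllegalArgumentException ex) {
            System.out.println("invalid format caught only at runtime");
        }
        System.out.println(f);  // prints COLOR
    }
}
```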

_________________
TWL - The Themable Widget Library


Posted: Sun Jun 23, 2013 6:54 am
MatthiasM wrote:
The decoder support is part of the renderer implementation - not of the renderer API.

I'd have to disagree here: what does the renderer implementation have to do with a decoder? If I have an image format and create a decoder to read it in as a texture in TWL, then my primary concern, as in any OpenGL implementation, is to create a ByteBuffer and define some simple metrics such as format, width, type, etc. How you actually render this on screen is up to the renderer.

If I decode an image, it doesn't matter whether I render with Android, LWJGL or JOGL, because they can all use the same metrics and the same data set to render how they like. The MVC design pattern splits out the view and the controller on purpose, specifically for this.

MatthiasM wrote:
Different renderer may have other sets of available formats. Putting the enum in the API would restrict their implementation

Not really: if a decoder wants to add a new format, you would need to add new functionality to LWJGLTexture anyway.

MatthiasM wrote:
, and using a String would move the error checking to runtime.

Then add a generic enum to the TextureDecoder interface, check it where you check now, and have the LWJGLTexture class convert it to the LWJGL formats.
Adding a format in the XML of the theme is a little weird anyway, as the format should be deduced by the decoder that is reading the image file. Normally, if there are multiple output types, the header in the image file determines what is contained.
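A rough sketch of what I mean (the enum and method names are just my suggestion, not existing TWL API):

```java
public class EnumBridge {
    // Hypothetical generic enum on the decoder side - just the two values the
    // theme XML actually uses.
    enum DecoderFormat { COLOR, ALPHA }

    // Stand-in for the renderer's own format enum (the real LWJGLTexture.Format
    // has more members than this).
    enum LWJGLFormat { COLOR, ALPHA }

    // The renderer-specific conversion lives on the LWJGL side, so the decoder
    // interface stays renderer independent and the check stays at compile time.
    static LWJGLFormat toLWJGL(DecoderFormat f) {
        switch (f) {
            case ALPHA: return LWJGLFormat.ALPHA;
            default:    return LWJGLFormat.COLOR;
        }
    }

    public static void main(String[] args) {
        System.out.println(toLWJGL(DecoderFormat.ALPHA));  // prints ALPHA
    }
}
```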


Posted: Sun Jun 23, 2013 7:13 am (MatthiasM)
The format in the XML is basically only "COLOR" or "ALPHA" and is used only as a hint if a monochrome image is supplied.

Also, there are renderer implementations which don't work with ByteBuffers (e.g. Java2D-based ones use BufferedImage).

Posted: Sun Jun 23, 2013 7:28 am
The decode method only supports ByteBuffer.

If you only support COLOR and ALPHA, then the problem with strings really goes away.


Posted: Sun Jun 23, 2013 8:26 am (MatthiasM)
The reason the decode method only supports ByteBuffer is that it's specific to the LWJGLRenderer implementation, not a generic interface on the renderer abstraction layer.

You can of course write your decoders in a way that they can be used by multiple renderers (this might require an additional layer, or bundling the interface classes for the different distributions).
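Such an additional layer could look roughly like this (a minimal sketch; all names here are hypothetical, not part of TWL):

```java
import java.io.IOException;
import java.nio.ByteBuffer;

public class AdapterSketch {
    // Hypothetical renderer-independent core decoder.
    interface CoreDecoder {
        ByteBuffer decodePixels() throws IOException;  // decoded RGBA data
    }

    // Thin LWJGL-facing adapter: adapts the core decoder to a
    // fill-this-ByteBuffer style call, keeping the core reusable elsewhere.
    static class LWJGLDecoderAdapter {
        private final CoreDecoder core;
        LWJGLDecoderAdapter(CoreDecoder core) { this.core = core; }

        void decode(ByteBuffer target) throws IOException {
            target.put(core.decodePixels());
        }
    }

    public static void main(String[] args) throws IOException {
        // One-pixel dummy decoder (opaque red), for illustration only.
        CoreDecoder core = () -> {
            ByteBuffer b = ByteBuffer.allocate(4);
            b.put(new byte[] { (byte) 0xFF, 0, 0, (byte) 0xFF });
            b.flip();
            return b;
        };
        ByteBuffer target = ByteBuffer.allocate(4);
        new LWJGLDecoderAdapter(core).decode(target);
        System.out.println(target.position());  // prints 4
    }
}
```

A Java2D-based renderer would get its own adapter that copies the same pixel data into a BufferedImage instead.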
