SDL2: smooth texture (sprite) animation between points in time


Question


Currently I'm trying to develop a smooth animation effect via a hardware-accelerated technique (DirectX or OpenGL). My current goal is very simple: I would like to move a texture from point A to point B in a given duration. This is the classic way to animate objects.

I read a lot about Robert Penner's interpolations, and for this purpose I would like to animate my texture with the simplest linear interpolation, as described here: http://upshots.org/actionscript/jsas-understanding-easing
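To sketch the idea from that article: an easing curve just reshapes the normalized time t in [0, 1] before it scales the movement, and linear interpolation is the identity curve. A minimal sketch (EaseInOutQuad is a hypothetical Penner-style helper, not part of SDL):

//easing reshapes normalized time t in [0, 1]; linear is the identity
double Linear(double t)
{
    return t;
}

//hypothetical quadratic ease-in-out helper (not part of SDL)
double EaseInOutQuad(double t)
{
    return (t < 0.5) ? 2.0 * t * t
                     : 1.0 - 2.0 * (1.0 - t) * (1.0 - t);
}

//position = begin + (end - begin) * curve(t)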

Everything works, except that my animation is not smooth; it is jerky. The reason is not frame dropping, it is a double-to-int rounding issue.

I prepared a very short sample in C++ with the SDL2 library to show that behavior:

#include "SDL.h"

//my linear interpolation animation function
double GetPos(double started, double begin, double end, double duration)
{
    return (end - begin) * (double)(SDL_GetTicks() - started) / duration + begin;
}

int main(int argc, char* argv[])
{
    //init SDL system
    SDL_Init(SDL_INIT_EVERYTHING);

    //create window
    SDL_Window* wnd = SDL_CreateWindow("My Window", 0, 0, 1920, 1080, SDL_WINDOW_SHOWN | SDL_WINDOW_BORDERLESS);

    //create renderer; in my case this is the D3D9 renderer, but the behavior is the same with D3D11 and OPENGL

    SDL_Renderer* renderer = SDL_CreateRenderer(wnd, 0, SDL_RENDERER_ACCELERATED | SDL_RENDERER_TARGETTEXTURE | SDL_RENDERER_PRESENTVSYNC);

    //load image and create texture
    SDL_Surface* surf = SDL_LoadBMP("sample_path_to_bmp_file");

    SDL_Texture* tex = SDL_CreateTextureFromSurface(renderer, surf);

    //get rid of the surface, we don't need it anymore
    SDL_FreeSurface(surf);

    SDL_Event event;
    int action = 0;
    bool done = false;

    //animation time start and duration
    double time_start = (double) SDL_GetTicks();
    double duration = 15000;

    //loop render
    while (!done)
    {
        action = 0;
        while (SDL_PollEvent(&event))
        {
            switch (event.type)
            {
            case SDL_QUIT:
                done = true;
                break;
            case SDL_KEYDOWN:
                action = event.key.keysym.sym;
                break;
            }
        }

        switch (action)
        {
        case SDLK_q:
            done = true;
            break;
        default:
            break;
        }

        //clear screen
        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);

        //calculate new position
        double myX = GetPos(time_start, 10, 1000, duration);

        SDL_Rect r;

        //assign position (rounded from double to int)
        r.x = (int) std::round(myX);
        r.y = 10;
        r.w = 600;
        r.h = 400;

        //render to rendertarget
        SDL_RenderCopy(renderer, tex, nullptr, &r);

        //present
        SDL_RenderPresent(renderer);
    }

    //cleanup
    SDL_DestroyTexture(tex);
    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(wnd);

    SDL_Quit();

    return 0;
}

I suppose that the jerky animation effect is related to my GetPos(...) function, which works with double values, while I'm rendering with int values. But I can't render to the screen in doubles, because I obviously can't draw at 1.2 px.

My question is: do you know any technique, or do you have any advice, on how to make that kind of animation (from, to, duration) smooth, without the jerky effect? I'm sure it's definitely possible, because frameworks like WPF, WinRT, Cocos2d-x and Android (Java) all support that kind of animation, and their texture/object animation is smooth. Thanks in advance.

Edit: as per @genpfault's request in the comments, I'm adding the frame-by-frame x position values, as int and double:

rx: 12    myX: 11.782
rx: 13    myX: 13.036
rx: 13    myX: 13.366
rx: 14    myX: 14.422
rx: 16    myX: 15.544
rx: 17    myX: 16.666
rx: 18    myX: 17.722
rx: 19    myX: 18.91
rx: 20    myX: 19.966
rx: 21    myX: 21.154
rx: 22    myX: 22.21
rx: 23    myX: 23.266
rx: 24    myX: 24.388
rx: 25    myX: 25.444
rx: 27    myX: 26.632
rx: 28    myX: 27.754
rx: 29    myX: 28.81
rx: 30    myX: 29.866
rx: 31    myX: 30.988
rx: 32    myX: 32.044
rx: 33    myX: 33.166
rx: 34    myX: 34.288
rx: 35    myX: 35.344
rx: 36    myX: 36.466
rx: 38    myX: 37.588
rx: 39    myX: 38.644
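
Note the pattern in the log: myX grows smoothly, but after rounding, rx advances by 0, 1 or 2 px per frame (13 to 13, 14 to 16, 25 to 27, 36 to 38). Those irregular integer steps are exactly the visible stutter.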

Final update/solution:

  1. I changed the question title from DirectX/OpenGL to SDL2, because the issue is related to SDL2 itself.
  2. I marked Rafael Bastos' answer as correct because he pushed me in the right direction; the issue is caused by the SDL render pipeline, which is based on int-precision values.
  3. As we can see in the log above, the stuttering is caused by irregular X values, which are rounded from floats. To solve the issue I had to change the SDL2 render pipeline to use floats instead of integers.
  4. Interestingly, SDL2 internally uses floats for its opengl, opengles2, d3d9 and d3d11 renderers, but the public SDL_RenderCopy/SDL_RenderCopyEx API is based on SDL_Rect and int values. This causes jerky animation effects when the animation is based on an interpolation function (see the float sketch below the list).
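
For reference: since SDL 2.0.10, SDL_FRect and the float-precision variants SDL_RenderCopyF/SDL_RenderCopyExF are part of the public API, so on current SDL2 releases this patching is no longer necessary. A minimal sketch of the loop body using them (same GetPos, renderer and texture as in the sample above):

//calculate new position and keep it in floating point all the way down
double myX = GetPos(time_start, 10, 1000, duration);

//float destination rect: no int rounding, no irregular 1-2 px steps
SDL_FRect fr;
fr.x = (float) myX;
fr.y = 10.0f;
fr.w = 600.0f;
fr.h = 400.0f;

//float-precision copy, public API since SDL 2.0.10
SDL_RenderCopyF(renderer, tex, nullptr, &fr);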

What exactly I changed in SDL2 is far beyond the scope of Stack Overflow, but in the next steps I wrote down the main points of what should be done to avoid animation stuttering:

  1. I moved the SDL_FRect and SDL_FPoint structs from the internal sys_render API to the render.h API to make them public.
  2. I extended the current SDL methods in rect.h/rect.c to support SDL_FRect and SDL_FPoint, such as SDL_HasIntersectionF(...), SDL_IsEmptyF(...) and SDL_IntersectRectF(...).
  3. I added a new method GetRenderViewPortF, based on GetRenderViewPort, to support float precision.
  4. I added two new methods, SDL_RenderCopyF and SDL_RenderCopyFEx, to avoid any rounding of values and to pass real float values to the internal renderers.
  5. All public functions must be reflected in the dyn_proc SDL API; this requires some knowledge of the SDL architecture.
  6. To avoid SDL_GetTicks() and any other timing-precision issues, I decided to change my interpolation step from time dependency to frame dependency (see the sketch after this list). For example, to calculate the animation duration I'm not using:

    float start = SDL_GetTicks();  
    float duration = some_float_value_in_milliseconds;  
    

    i replaced that to:

    float step = 0;
    float duration = some_float_value_in_milliseconds / MonitorRefreshRate;
    

    and now I'm incrementing step++ after each rendered frame.
    Of course this has a side effect: if my engine drops some frames, then my animation time no longer equals the duration, because the animation is frame-dependent. Naturally, these duration calculations are only valid when VSYNC is ON; they are useless when vblank is off.
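
    A minimal sketch of that frame-stepped variant (the frame-count arithmetic is my own, assuming a fixed 60 Hz refresh rate; the point is that progress advances exactly once per presented frame):

    //sketch: frame-stepped interpolation, valid while VSYNC is ON
    float refresh_rate = 60.0f;                                //assumed monitor refresh rate
    float duration_ms  = 15000.0f;
    float total_steps  = duration_ms / 1000.0f * refresh_rate; //frames the animation spans
    float step = 0.0f;

    //inside the render loop, once per SDL_RenderPresent:
    float t = step / total_steps;            //normalized progress in [0, 1]
    float x = 10.0f + (1000.0f - 10.0f) * t; //linear interpolation, begin = 10, end = 1000
    step += 1.0f;                            //advance exactly one frame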

And now I have really smooth, jerk-free animations with timeline functions.
@genpfault and @RafaelBastos, thanks for your time and your advice.


Answer 1:


It seems you need to subtract started from SDL_GetTicks().

Something like this:

(end - begin) * ((double)SDL_GetTicks() - started) / duration + begin

(end - begin) gives you the total movement

(SDL_GetTicks() - started) / duration gives you the interpolation ratio. Multiplied by the total movement, that gives you the amount interpolated, which needs to be added to the begin portion so you get the absolute interpolated position.
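
A quick worked example with the question's numbers (begin = 10, end = 1000, duration = 15000 ms): halfway through, SDL_GetTicks() - started is 7500, so the ratio is 7500 / 15000 = 0.5 and the position is (1000 - 10) * 0.5 + 10 = 505.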

If that's not it, then it is probably a rounding issue. But if you can only render with int precision, then I think you need to bypass SDL and render using plain OpenGL or DirectX calls, which allow floating-point precision.



Source: https://stackoverflow.com/questions/36746246/sdl2-smooth-texturesprite-animation-between-points-in-time-function
