1
tueor
4y

If I make a character oscillate on my screen and match its time period to my camera's fps, it should appear still on my camera, at least in theory, right? I tried to implement it like this: https://www.paste.org/115774 but I can't figure out what could be going wrong. My guesses are that either I'm expecting too much precision, or my camera's fps fluctuates a little. Either way, please let me know what you think could be going wrong.
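
The linked paste is no longer accessible (as noted in the comments), so the following is only a rough guess at what the attempt might have looked like, pieced together from the functions and numbers mentioned below (nanosleep(), printf(), and a delay of roughly 666666 ns, i.e. about 1.5 kHz):

    /* Hypothetical sketch of the idea: print an "o" that jumps between two
     * columns, sleeping roughly one camera-frame period between moves, in the
     * hope that the camera always samples it in the same position. */
    #define _POSIX_C_SOURCE 199309L
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        /* assumed delay: ~666666 ns per move, taken from the comments */
        struct timespec delay = { .tv_sec = 0, .tv_nsec = 666666 };
        int right = 0;

        for (;;) {
            /* \r returns to the start of the line; draw the "o" in one of two spots */
            printf(right ? "\r o" : "\ro ");
            fflush(stdout);
            right = !right;
            nanosleep(&delay, NULL);   /* sleeps *at least* this long */
        }
        return 0;
    }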

Comments
  • 6
    The link doesn't work (access denied), but the answer is that you will always have deviations in frequency if you don't have a common time base. The systems will always run out of sync.

    That's also why e.g. in data transmission, you generate the relevant clock from the data stream itself.
  • 0
    @Fast-Nop ughhh yeah, paste.org has been glitchy lately, behold the epitome of code sharing: https://walloftext.co/devrant

    I don't entirely get what you're saying. Do you mean that my mobile and my computer will have some difference at the smallest units of time?

    I was trying to replicate this btw: https://youtu.be/_xTjyV8F6XU
  • 1
    @tueor yes, the frames are not in sync between the display and the camera. So your camera is either a bit too fast or a bit too slow, and it also captures mid-frame on the display, when one frame ends and a new one starts displaying.
  • 0
    @iiii the only option that remains now is to start my program with 1 nanosecond and increment it once each time I press some key, only around 666666 possibilities to check, the brutest force in the entire universe
  • 3
    @tueor The time measurement in computers is based on quartz oscillators, and no two oscillators have exactly the same frequency.

    On top of that, the nanosleep function doesn't actually sleep for the exact amount of time that you hand over. It just sleeps for at least this duration. That's because other processes may also run, and the time slice resolution of the OS is also a factor.

    On top of that, the delay you're giving aims at a 1.5 kHz frequency - computer monitors usually only refresh their image at 60 Hz (some gaming ones can do more).

    Yes, that helicopter effect is well-known e.g. from old Wild West movies with rotating horse carriage wheels.
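
    For illustration, that overshoot can be measured directly; a minimal sketch (not from the original post, reusing its assumed 666666 ns delay) that times a single nanosleep() call against CLOCK_MONOTONIC:

        /* Measure how far one nanosleep() overshoots the requested 666666 ns. */
        #define _POSIX_C_SOURCE 199309L
        #include <stdio.h>
        #include <time.h>

        int main(void)
        {
            struct timespec req = { .tv_sec = 0, .tv_nsec = 666666 };
            struct timespec t0, t1;

            clock_gettime(CLOCK_MONOTONIC, &t0);
            nanosleep(&req, NULL);
            clock_gettime(CLOCK_MONOTONIC, &t1);

            /* elapsed is always >= the request; the excess is scheduler/timer jitter */
            long elapsed_ns = (t1.tv_sec - t0.tv_sec) * 1000000000L
                            + (t1.tv_nsec - t0.tv_nsec);
            printf("requested 666666 ns, slept %ld ns (overshoot %ld ns)\n",
                   elapsed_ns, elapsed_ns - 666666L);
            return 0;
        }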
  • 1
    @Fast-Nop Thank you for your explanation, I am convinced to give up

    Also, I did consider the refresh rate. I thought that even if the "o" gets printed at a higher frequency than the screen's refresh rate, it would land at the same spot right when the screen is refreshed: if a complete oscillation of the "o" takes time t, the screen refreshes once at t/2 and once at t. But I just realised that, by that logic, it should be visible at only two spots.

    Sigh, this idea was just impossible / too hard to implement.
  • 2
    I would first get the display to show different pictures every other frame. Then make your camera take pictures with half the frame rate of the display. Make sure the shutter speed is not so long that you get a picture of both frames. You should be able to see the effect well even if the clocks are not perfectly in sync.
  • 2
    A few problems:

    - You can't get precise timing when you run inside an OS. An RT kernel (see Linux RT) would help, but still wouldn't solve it. 1.5 kHz is too fast.

    - AMD64 is bad for RT software: cache misses, frequency changes and non-deterministic interrupts that pause your instruction stream for several μs are a problem.

    - Your screen may not update every pixel at the same time.

    If you can, choose dedicated simple hardware for RT applications, like a microcontroller or, if you need faster reaction time, FPGAs.
  • 2
    Don't use printf(). printf() may allocate or free memory, which can create page faults, which create context switches and long delays while the kernel assigns new pages to the application. Use write() and read() or any other syscall that does not use buffering. Use only one buffer.

    Your terminal is probably not suitable for RT applications. Use some other output that does not need your screen.

    You need absolute time measurements, like clock_gettime() with CLOCK_MONOTONIC.
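
    A rough sketch of how that advice could be combined (an interpretation, not code from the thread, reusing the assumed two-position "o" and ~666666 ns period): pace the loop with absolute deadlines on CLOCK_MONOTONIC via clock_nanosleep() so per-iteration jitter does not accumulate, and output with write() instead of buffered printf():

        #define _POSIX_C_SOURCE 200112L
        #include <time.h>
        #include <unistd.h>

        int main(void)
        {
            const long period_ns = 666666;   /* assumed ~1.5 kHz step, as in the original attempt */
            struct timespec next;
            clock_gettime(CLOCK_MONOTONIC, &next);
            int right = 0;

            for (;;) {
                /* unbuffered output of the two positions; 3 bytes each */
                (void)write(STDOUT_FILENO, right ? "\r o" : "\ro ", 3);
                right = !right;

                /* advance the absolute deadline and sleep until it */
                next.tv_nsec += period_ns;
                if (next.tv_nsec >= 1000000000L) {
                    next.tv_nsec -= 1000000000L;
                    next.tv_sec += 1;
                }
                clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
            }
            return 0;
        }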
  • 1
    @electrineer Just to make sure I understand you correctly, what you're saying is that I should have a 60 Hz monitor with a 30 Hz camera, and if I display a different image every 1/60 of a second (~0.0167 s), I should be able to see this effect?
  • 0
    @happygimp0 Thank you for your explanation, I'll try to go down the microcontroller/other output path if time allows.

    And the printf() causing context switches / page faults problem, do you mean that in the context of RT systems? I thought those things were unavoidable, so I never think about them while programming, but it makes sense in an RT system.
  • 1
    Use a webcam.
  • 0
    @blindXfish for a lower fps?
  • 1
    @tueor the webcam runs on the same system, so basically the time() and the now() will be the same for the video and the program. You can even interconnect them and choose exactly which frames you want.
  • 1
    Or use a custom webcam driver where you can align the frame rate instead of aligning the software.
  • 1
    I don't know which implementation of printf() you use. It could be that printf() calls malloc(), which can create page faults, but it does not have to be that way. Context switches are unavoidable, but keep them to a minimum and avoid allocation after the application has started.

    A microcontroller and an LED matrix could be fast enough when addressed in parallel. This could work even with a clock of 8 or 16 MHz. Do you know how to build a fast enough LED matrix?
  • 0
    @happygimp0 Thanks again, and no, I have zero idea about that, but I'll look it up.
  • 0
    Do you know those LED text fans? Your idea reminds me of them, and they work without a matrix; the LEDs are all in a single row.