I am writing a ray tracer in Rust with nalgebra and I’ve been having trouble with implementing the camera for the past several days.
My goal is to make the camera behave the same way as in FPS games (as described in https://gamedev.stackexchange.com/questions/19507/how-should-i-implement-a-first-person-camera). Which types should I use to represent such a camera?
Are yaw/pitch angles and location enough?
That is the part I know how to implement; I would just like suggestions on which types to use, e.g. `Rotation3` for the angles?
The difficult part is that I also need to cast a ray through each point on the screen, originating at the camera location. I am pretty sure I need quaternions for this, but I have no idea what the correct procedure would be.
What I know how to calculate:
- camera location,
- camera direction vector,
- screen width,
- screen height,
- screen point x,
- screen point y,
- the field of view (the angle at the camera location between the top-left and bottom-right corners of the screen, i.e. the diagonal FOV),
- distance of camera from the screen center
What I would like to calculate:
- the direction vector from the camera location to each point on the screen
Any help is much appreciated.