How to convert from Trigonometric coordinate to Cartesian coordinate

andre_teprom

10-27-2016, 08:46 AM

Apologies if the question is too trivial for this forum, but I'm a complete noob with this library. I'm about to upgrade a system that previously had to convert <x,y,z> vectors into <angular> vectors by pure math, point by point. The system is a mechanical arm, and its board computes the coordinate transformation with trigonometric calculations. However, the new board has a GPU with OpenGL support, so I would like to know which function of the library I should use for this.

Thanks in advance.

BBeck1

10-28-2016, 05:55 AM

I didn't even realize there was a different way to write vectors like this. lol But I'm pretty good at digging for answers.

I assume you mean that you want to change the vectors into "spherical vectors" as described here (https://en.wikipedia.org/wiki/Vector_notation). I found the math (http://mathworld.wolfram.com/SphericalCoordinates.html) to do it. It looks pretty straightforward, and you could write your own function for it. But GLM is the math library for OpenGL, and I think it may have some functions for this (https://glm.g-truc.net/0.9.2/api/a00285.html). The conversion of "Euclidean" coordinates to "polar" coordinates might be what's needed in 3D space.

You may also want to google "compute shader polar coordinates" to look into doing this on the GPU in GLSL. Compute shaders are a thing now, where you use the GPU to do math instead of graphics. If that's what you want, you should probably be looking at writing a compute "shader" in GLSL. You could try googling something like "glsl compute shader example".
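If you go the compute-shader route, the per-point math is the same; a rough GLSL sketch might look like the following. The buffer layout, binding points, and workgroup size here are all assumptions for illustration — you would bind your own SSBOs and dispatch enough workgroups to cover your point count:

```glsl
#version 430
layout(local_size_x = 64) in;

// Input: Cartesian points (xyz, w unused); output: (r, theta, phi, 0).
// Binding points 0 and 1 are arbitrary choices for this sketch.
layout(std430, binding = 0) buffer InBuf  { vec4 cart[]; };
layout(std430, binding = 1) buffer OutBuf { vec4 sph[];  };

void main() {
    uint i = gl_GlobalInvocationID.x;
    vec3 p = cart[i].xyz;
    float r     = length(p);
    float theta = atan(p.y, p.x);                   // azimuth (two-argument atan)
    float phi   = (r > 0.0) ? acos(p.z / r) : 0.0;  // polar angle from +z
    sph[i] = vec4(r, theta, phi, 0.0);
}
```

Note that for a robot arm processing a handful of points per update, doing this on the CPU is almost certainly simpler and faster than a GPU round trip; compute shaders pay off when you have thousands of points per frame.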

andre_teprom

10-30-2016, 01:02 PM

Many thanks!

The glm::polar function does exactly what I need.
