opengl

Why is my OpenGL program unable to load the GLUT bitmap fonts?

筅森魡賤 Submitted on 2021-02-05 08:43:07
Question: I want to display some simple text on the HUD of my 3D environment in OpenGL. All sources say that I should use the bitmap fonts included in GLUT; however, my program can't seem to find or load the fonts. I have included all of the correct libraries and have double-checked that the fonts.py file is definitely in the ...\Python37\...\OpenGL\GLUT\ directory, but when I type GLUT_BITMAP_TIMES_ROMAN_24 (or any of the other bitmap fonts), it is highlighted as an error in my code. This is the
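For reference, a minimal C sketch of how a GLUT bitmap font is typically drawn onto a HUD (freeglut assumed; this is the C API that PyOpenGL wraps, not the asker's code):

#include <GL/freeglut.h>

/* Draw a string at window coordinates (x, y) with a GLUT bitmap font. */
static void drawHudText(float x, float y, const char *text)
{
    /* Temporarily switch to a window-space orthographic projection so the
       text is placed in pixels rather than in the 3D scene. */
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    gluOrtho2D(0.0, glutGet(GLUT_WINDOW_WIDTH), 0.0, glutGet(GLUT_WINDOW_HEIGHT));
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadIdentity();

    glRasterPos2f(x, y);                               /* anchor of the string */
    for (const char *c = text; *c; ++c)
        glutBitmapCharacter(GLUT_BITMAP_TIMES_ROMAN_24, *c);

    glPopMatrix();                                     /* restore 3D matrices  */
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
}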

glDrawArrays only updates when I exit

♀尐吖头ヾ Submitted on 2021-02-05 08:29:50
Question: I have this Python 3 code which doesn't work on Windows machines but worked on a Linux machine. I draw a green screen and a red triangle, but the red triangle only appears when I exit. import pygame import numpy import OpenGL.GL as gl import OpenGL.GL.shaders as shaders from pygame.rect import Rect RED = (255, 0, 0) WHITE = (255, 255, 255) pygame.init() screen = pygame.display.set_mode((800, 600), pygame.OPENGL) vertes_shader = """ #version 330 in vec4 position; void main() { gl_Position
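A common cause of this symptom (not necessarily the asker's exact bug) is that the back buffer is never presented inside the loop. A generic C/GLFW sketch of the draw-then-swap pattern, where glfwSwapBuffers plays the role pygame.display.flip() does in pygame:

#include <GLFW/glfw3.h>

/* Hypothetical render loop: drawing only becomes visible once the back
   buffer is swapped, so the swap has to happen every frame, not at exit. */
static void renderLoop(GLFWwindow *window)
{
    while (!glfwWindowShouldClose(window)) {
        glClearColor(0.0f, 1.0f, 0.0f, 1.0f);   /* green background           */
        glClear(GL_COLOR_BUFFER_BIT);
        glDrawArrays(GL_TRIANGLES, 0, 3);       /* red triangle, shader bound */
        glfwSwapBuffers(window);                /* present this frame         */
        glfwPollEvents();                       /* keep the window responsive */
    }
}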

Implicit synchronization with glUniform*

倖福魔咒の Submitted on 2021-02-05 08:00:47
Question: Is there an implicit synchronization in the following GL code? glClear(GL_COLOR_BUFFER_BIT); glUseProgram(prg); glUniform1f(loc, a); glDrawArrays(GL_TRIANGLES, 0, N); glUniform1f(loc, b); // <-- Implicit synchronization?? glDrawArrays(GL_TRIANGLES, 0, N); swapBuffers(); Answer 1: As always, OpenGL implementations can handle things the way they like, as long as it results in the correct behavior. So there's really no way to say how it has to work. That being said, updating uniforms is a common
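As an illustration only (not the full answer), the same sequence annotated with where a conforming driver typically does and does not have to wait:

glClear(GL_COLOR_BUFFER_BIT);
glUseProgram(prg);

glUniform1f(loc, a);               /* value 'a' is recorded for draw #1        */
glDrawArrays(GL_TRIANGLES, 0, N);  /* queued; the GPU may execute it later     */

glUniform1f(loc, b);               /* drivers normally stage the new value in  */
                                   /* the command stream, so the CPU does not  */
                                   /* have to wait for draw #1 to finish       */
glDrawArrays(GL_TRIANGLES, 0, N);  /* draw #2 is guaranteed to see 'b'         */

swapBuffers();                     /* blocking, if any, usually happens here   */
                                   /* (vsync / frame pacing), not at glUniform */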

Why does my wireframe sphere turn into an ellipsoid when translating and changing the camera angle?

十年热恋 Submitted on 2021-02-05 07:58:27
Question: I need to translate my wireframe sphere along the z-axis back and forth while also changing the camera angle. Whenever my sphere gets translated, it slowly turns into an ellipsoid, and I really don't understand why. Here are the pieces of code where I believe the mistake is. Also, the shapes' proportions shouldn't change when the window is resized, only their size. void init() { glClearColor(0.0, 0.0, 0.0, 0.0); glEnable(GL_DEPTH_TEST); } void display(void) { glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); glPushMatrix(
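One common reason for spheres stretching into ellipsoids is a projection whose aspect ratio no longer matches the viewport. A hedged GLUT-style sketch of a reshape callback that keeps the two in sync (not taken from the asker's code):

#include <GL/freeglut.h>

void reshape(int width, int height)
{
    if (height == 0) height = 1;                    /* avoid division by zero */
    glViewport(0, 0, width, height);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0,                            /* vertical field of view */
                   (double)width / (double)height,  /* aspect ratio           */
                   1.0, 100.0);                     /* near and far planes    */

    glMatrixMode(GL_MODELVIEW);                     /* leave modelview to the */
    glLoadIdentity();                               /* display callback       */
}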

Segmentation fault when including glad.h

懵懂的女人 Submitted on 2021-02-05 07:13:25
Question: I am following the GLFW getting-started guide but I can't seem to make it run with GLAD. Here's my C file (prac.c): #include <stdio.h> #include <stdlib.h> #include<glad/glad.h> #include<GLFW/glfw3.h> void error_callback(int error, const char* description) { fprintf(stderr, "Error %d: %s\n", error, description); } int main(void) { GLFWwindow* window; if(!glfwInit()) return -1; glfwSetErrorCallback(error_callback); glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3); glfwWindowHint(GLFW_CONTEXT
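A frequent cause of a segfault in this setup is calling a gl* function before GLAD has loaded the function pointers. A minimal sketch of the usual initialisation order (assumes a GLAD 1.x C loader with the gladLoadGLLoader entry point):

#include <glad/glad.h>        /* must come before any header that pulls in gl.h */
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit())
        return -1;

    GLFWwindow *window = glfwCreateWindow(640, 480, "prac", NULL, NULL);
    if (!window) { glfwTerminate(); return -1; }

    glfwMakeContextCurrent(window);      /* a current context is required first */
    if (!gladLoadGLLoader((GLADloadproc)glfwGetProcAddress)) {
        fprintf(stderr, "Failed to initialise GLAD\n");
        glfwTerminate();
        return -1;
    }

    /* Only after the loader has run is it safe to call gl* functions. */
    printf("OpenGL %s\n", (const char *)glGetString(GL_VERSION));

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}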

How do OpenGL's buffers work?

末鹿安然 Submitted on 2021-02-05 05:25:41
Question: I don't understand how OpenGL's buffers work. I am learning OpenGL from the OpenGL Red Book, 8th edition. For example, I have an array of positions, an array of colors, and an array of indices: static const GLfloat strip_position[] = { -4.0f, 0.0f, -1.0f, 1.0f, //0 -3.5f, -1.0f, -1.0f, 1.0f, //1 -3.0f, 0.0f, -1.0f, 1.0f, //2 -2.5f, -1.0f, -1.0f, 1.0f, //3 -2.0f, 0.0f, -1.0f, 1.0f, //4 -1.5f, -1.0f, -1.0f, 1.0f, //5 -1.0f, 0.0f, -1.0f, 1.0f, //6 -0.5f, -1.0f, -1.0f, 1.0f, //7 0.0f, 0.0f, -1.0f, 1
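A hedged sketch of the usual way such arrays end up in buffer objects (the names strip_colors and strip_indices, and the index type, are assumptions, not the book's exact code):

GLuint vao, vbo[2], ebo;

glGenVertexArrays(1, &vao);
glBindVertexArray(vao);

glGenBuffers(2, vbo);

/* Positions: copied once into a GPU-visible buffer, then described to the VAO. */
glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
glBufferData(GL_ARRAY_BUFFER, sizeof(strip_position), strip_position, GL_STATIC_DRAW);
glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, (void *)0);
glEnableVertexAttribArray(0);

/* Colors go into a second buffer bound to attribute location 1. */
glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
glBufferData(GL_ARRAY_BUFFER, sizeof(strip_colors), strip_colors, GL_STATIC_DRAW);
glVertexAttribPointer(1, 4, GL_FLOAT, GL_FALSE, 0, (void *)0);
glEnableVertexAttribArray(1);

/* Indices select which vertices form the strip; the binding is stored in the VAO. */
glGenBuffers(1, &ebo);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(strip_indices), strip_indices, GL_STATIC_DRAW);

/* Drawing later looks roughly like:
   glBindVertexArray(vao);
   glDrawElements(GL_TRIANGLE_STRIP, indexCount, GL_UNSIGNED_SHORT, 0);          */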

Platform specific macros in OpenGL headers

和自甴很熟 Submitted on 2021-02-04 20:49:27
Question: I was parsing gl.h and I noticed a quite unusual (at least to me) way of declaring OpenGL functions. Here is an example: GLAPI void APIENTRY glEvalCoord1d( GLdouble u ); GLAPI void APIENTRY glEvalCoord1f( GLfloat u ); GLAPI void APIENTRY glEvalCoord1dv( const GLdouble *u ); Obviously, those are regular functions with return type void, but my question is about the effect of the GLAPI and APIENTRY macros. Those two are platform-specific and are defined at the beginning of the header: /* GLAPI, part 1
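Illustrative only: the kind of definitions these macros usually expand to (the exact definitions vary between headers and platforms):

#if defined(_WIN32)
#  define APIENTRY __stdcall              /* calling convention of opengl32.dll  */
#  define GLAPI    __declspec(dllimport)  /* symbol is imported from a DLL       */
#else
#  define APIENTRY                        /* expands to nothing on other systems */
#  define GLAPI    extern                 /* plain external linkage              */
#endif

/* After preprocessing on Windows, the first declaration becomes roughly:
   __declspec(dllimport) void __stdcall glEvalCoord1d(GLdouble u);               */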

OpenGL default pipeline alpha blending does not make any sense for the alpha component

ε祈祈猫儿з Submitted on 2021-02-04 14:47:07
Question: Q: Is there a way to use the default pipeline to blend the alpha component properly? Problem: I'm drawing semi-transparent surfaces into a texture, then I want to blit that texture into the main frame back buffer. Normally, when you use straightforward alpha blending for transparency or antialiasing, it is used to blend color components into the back buffer (which has no alpha channel by default). However, when blending into a texture that has an alpha component, the usual GL_SRC_ALPHA, GL_ONE
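For context (this is not necessarily the accepted answer): the fixed-function way to use one set of blend factors for the colour channels and another for destination alpha is glBlendFuncSeparate, for example:

glEnable(GL_BLEND);
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,   /* RGB factors   */
                    GL_ONE,       GL_ONE_MINUS_SRC_ALPHA);  /* alpha factors */
/* Destination alpha becomes srcA + dstA * (1 - srcA), i.e. coverage          */
/* accumulates toward opaque instead of being scaled down to srcA * srcA.     */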