Socoder -> Off Topic -> "New" OpenGL

Sun, 09 Jun 2013, 18:48
HoboBen
Christ, the effort you need to put in just to get a textured square using the new OpenGL is insane. Yes, it's a very, very fast textured square, but christ...

-=-=-
blog | work | code | more code
Sun, 09 Jun 2013, 18:48
Jayenkai
I'm too lazy to do that shit anymore! How bad is it?

-=-=-
''Load, Next List!''
Sun, 09 Jun 2013, 19:17
HoboBen
You used to just be able to use the fixed functions from whatever language, like

glBegin(GL_QUADS);
glVertex2i(0, 0);
glVertex2i(100, 0);
...
glEnd();

The problem was the graphics card spent all that time waiting for the CPU to run those functions.

Now you have to write two shaders, bind a load of arrays and variables to them with a really obtuse API, and compute projection matrices yourself, all while running into driver bugs, ten different versions of the shading language, and completely wrong documentation.

All the data is sent to the GPU up front, and in the end your main loop is literally only two lines, because the vertex data is already uploaded to the GPU.

glUseProgram(shaderProgram);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);


But the setup is HORRENDOUS!
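
For the curious, here's roughly what that setup boils down to for one textured square. The GL calls are the real ones, but the shader source, variable names and layout are just a sketch; real code also needs compile/link log checks and the texture creation on top of this.

/* Two tiny shaders as C strings (GLSL 1.50, for a GL 3.2 core context). */
static const char *vsSource =
    "#version 150\n"
    "uniform mat4 projection;\n"
    "in vec2 position;\n"
    "in vec2 texcoord;\n"
    "out vec2 uv;\n"
    "void main() {\n"
    "    uv = texcoord;\n"
    "    gl_Position = projection * vec4(position, 0.0, 1.0);\n"
    "}\n";

static const char *fsSource =
    "#version 150\n"
    "uniform sampler2D tex;\n"
    "in vec2 uv;\n"
    "out vec4 fragColor;\n"
    "void main() { fragColor = texture(tex, uv); }\n";

/* Compile and link (check the logs with glGetShaderiv/glGetProgramiv in real code). */
GLuint vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vsSource, NULL);
glCompileShader(vs);

GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fsSource, NULL);
glCompileShader(fs);

GLuint shaderProgram = glCreateProgram();
glAttachShader(shaderProgram, vs);
glAttachShader(shaderProgram, fs);
glLinkProgram(shaderProgram);

/* One textured square: x, y, u, v for each corner of a triangle strip. */
static const GLfloat quad[] = {
      0.0f,   0.0f, 0.0f, 0.0f,
    100.0f,   0.0f, 1.0f, 0.0f,
      0.0f, 100.0f, 0.0f, 1.0f,
    100.0f, 100.0f, 1.0f, 1.0f,
};

GLuint vao, vbo;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(quad), quad, GL_STATIC_DRAW);

GLint pos = glGetAttribLocation(shaderProgram, "position");
GLint tc  = glGetAttribLocation(shaderProgram, "texcoord");
glEnableVertexAttribArray((GLuint)pos);
glVertexAttribPointer((GLuint)pos, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), (void *)0);
glEnableVertexAttribArray((GLuint)tc);
glVertexAttribPointer((GLuint)tc, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), (void *)(2 * sizeof(GLfloat)));

/* ...plus creating the texture and uploading the projection matrix with glUniformMatrix4fv.
   Only after all of that do the two lines above actually draw anything. */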


-=-=-
blog | work | code | more code
Sun, 09 Jun 2013, 20:15
CodersRule
How I hate writing shaders... those things put me off of game development pretty effectively.

I was just watching someone write a game for a game jam on a livestream, and his game had a problem on my system. We spent a while debugging, and it was, of course, a shader bug that appeared only on AMD cards.
Sun, 09 Jun 2013, 23:18
HoboBen
Arghhh, apparently GLSL doesn't allow variable indices into an array except in very specific circumstances ARGHHH

No errors or anything, it just silently fails!!!
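
For reference, the kind of thing that tripped me up looks roughly like this (just a sketch, names invented):

/* Fragment shader sketch: indexing a uniform array with a value computed at runtime. */
static const char *paletteFsSource =
    "#version 150\n"
    "uniform vec4 palette[16];\n"
    "in float paletteIndex;\n"
    "out vec4 fragColor;\n"
    "void main() {\n"
    "    fragColor = palette[int(paletteIndex)];\n"   /* the variable index is the bit that silently failed for me */
    "}\n";

One common way around it is to put the table in a texture and sample that instead of indexing the array.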

edit: Also, I say "New" OpenGL, but OpenGL 3 (which is what I'm writing against) was released five years ago!!!

-=-=-
blog | work | code | more code
Mon, 10 Jun 2013, 02:11
Jayenkai
New is new! Age doesn't matter. To some of us (me!!), even trying certain things out in Blitz3D would count as New!

-=-=-
''Load, Next List!''
Mon, 10 Jun 2013, 03:14
Cower
Really, most of the effort is just the pain-in-the-neck math code that needs writing (unless you use glm or something). Shaders are simple; you should validate them in testing, check compile and link logs, etc., but the actual process of writing and using one is minimal.

Core GL 3.2 and above is a bit of a pain because of vertex array objects, but they save a lot of time once you get used to them, since you just define one and reuse it over and over. So, no more re-specifying attributes or rebinding buffers one by one; you just bind the VAO and draw.
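
Something like this, roughly (a sketch; the vbo is assumed to have been created and filled elsewhere):

/* Setup, done once: record the buffer binding and attribute layout in a VAO. */
GLuint vao;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (void *)0);
glBindVertexArray(0);

/* Every frame: no re-specifying anything, just bind and draw. */
glBindVertexArray(vao);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);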
Mon, 10 Jun 2013, 06:08
HoboBen
Yeah finally sort of got the hang of it... after four days!

Got stuff working, including looking up colors from a one-dimensional array to multiply with the texture.

And then even fancy distortion effects like a sine wave offset when sampling the texture in the fragment shader.
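
Roughly what the fragment shader ends up looking like, for anyone curious. This is just a sketch from memory with made-up names; the color table here is a 1D texture, and a "time" uniform drives the wobble.

static const char *effectFsSource =
    "#version 150\n"
    "uniform sampler2D tex;\n"          /* the actual image */
    "uniform sampler1D colorTable;\n"   /* 1D lookup: index in, color out */
    "uniform float time;\n"
    "in vec2 uv;\n"
    "in float colorIndex;\n"
    "out vec4 fragColor;\n"
    "void main() {\n"
    "    vec2 wobbled = uv + vec2(sin(uv.y * 40.0 + time) * 0.01, 0.0);\n"  /* sine-wave offset */
    "    vec4 tint = texture(colorTable, colorIndex);\n"
    "    fragColor = texture(tex, wobbled) * tint;\n"
    "}\n";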

If only there weren't so many buggy functions that you need to work around!!!

-=-=-
blog | work | code | more code
Mon, 10 Jun 2013, 06:14
Jayenkai
Surely that's half the fun, though, right!?

I dunno, I really enjoyed my time "Learning with Monkey".
Now that I've got the Framework up and running, and can whizz out games within a couple of hours, it's all gotten a bit samey.
I miss the days when I was going crazy, trying to figure out how to achieve the simplest of tasks!

-=-=-
''Load, Next List!''
Mon, 10 Jun 2013, 06:18
HoboBen
Time for you to learn New OpenGL then!!!

But you're right, it's a great feeling when you finally crack it.

-=-=-
blog | work | code | more code
Mon, 10 Jun 2013, 06:18
Afr0
"I miss the days when I was going crazy, trying to figure out how to achieve the simplest of tasks!"


Sigh, I already told you: the best way to do that is to abandon AGW/AGM and start a big project. With Project Dollhouse I'm learning new things all the time.
Stop making "silly little games" (your quote) and start getting serious!

As an example, yesterday I learned that a byte cannot possibly store a number bigger than 255, so you need more than one byte the moment a value can go past that. I should have realized that a long time ago, but apparently I hadn't when I decided that it was enough to use a byte to represent the length of a packet (and another one to represent the decrypted length of a packet).
I cursed myself to hell and back yesterday, but damn what a nice feeling it was when I fixed it!
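
(If anyone's curious, the fix boils down to spending two bytes on the length instead of one. Just as an illustration, not the actual Project Dollhouse code:)

/* Illustration only: a 16-bit length written as two bytes and read back. */
unsigned int length = 1000;                   /* too big for a single byte */
unsigned char header[2];
header[0] = (unsigned char)(length >> 8);     /* high byte */
header[1] = (unsigned char)(length & 0xFF);   /* low byte */

unsigned int decoded = ((unsigned int)header[0] << 8) | header[1];   /* == 1000 again */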

-=-=-
Afr0 Games

Project Dollhouse on Github - Please fork!
Mon, 10 Jun 2013, 06:25
Jayenkai
You oughta see this week's "silly little game"... It's fucking awesome!!!!

-=-=-
''Load, Next List!''
Mon, 10 Jun 2013, 06:38
Afr0
For instance... you already have a franchise going with Platdude. That's not a bad idea, I'll give you that. Blizzard made a franchise out of Warcraft before they made WoW.
So now you should retire and make Platdude World Online

If successful, you can even hire authors (like Cower!) to write books about the universe, like Blizzard has!
And you can branch out and design Platdude Lego!

Note to self: I should stop doing this for free, and get a paid gig as a top-level marketing consultant.

-=-=-
Afr0 Games

Project Dollhouse on Github - Please fork!
Mon, 10 Jun 2013, 07:56
9572AD
Why should an MMO be some sort of ultimate goal? It's just a style of game; it's not intrinsically better. I don't even care for that style of game, myself.

-=-=-
All the raw, animal magnetism of a rutabaga.
Mon, 10 Jun 2013, 07:58
Jayenkai
To take this thread back to its original topic....

Once you've got a decent framework in place, all should be relatively simple.
My past few frameworks have all used identical function names, so in many ways, a lot of my code is quite easily reusable.
Does this "New" GL allow for that sort of thing, or is it too drastically different?

-=-=-
''Load, Next List!''
Mon, 10 Jun 2013, 08:30
HoboBen
I suppose it depends. Like, you can base your framework on drawing images as individual super-optimised squares, and it'll basically give you the same sort of performance that OpenGL used to have.

But if you've got something of fixed position and size, like a screen full of 2D tiles, then it makes sense to take the effort to put all the information into a few huge arrays, so that you can upload most of it just the once at the start and, on each loop, process the whole lot with one function call.
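
Something like this, as a rough sketch (sizes made up, texture coordinates left out to keep it short, and tileVbo/tileVao assumed to be set up the same way as for a single quad):

/* One-off setup: one big vertex array covering a whole screen of tiles. */
enum { TILES_X = 40, TILES_Y = 30, TILE = 16 };
static GLfloat verts[TILES_X * TILES_Y * 6 * 2];   /* 6 vertices (2 triangles), x,y each, per tile */
int n = 0;
for (int ty = 0; ty < TILES_Y; ty++) {
    for (int tx = 0; tx < TILES_X; tx++) {
        GLfloat x0 = tx * TILE, y0 = ty * TILE, x1 = x0 + TILE, y1 = y0 + TILE;
        GLfloat tile[12] = { x0,y0, x1,y0, x0,y1,   x1,y0, x1,y1, x0,y1 };
        for (int i = 0; i < 12; i++) verts[n++] = tile[i];
    }
}
glBindBuffer(GL_ARRAY_BUFFER, tileVbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);   /* uploaded once */

/* Each frame: the whole map in one call. */
glBindVertexArray(tileVao);
glDrawArrays(GL_TRIANGLES, 0, TILES_X * TILES_Y * 6);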

-=-=-
blog | work | code | more code
Mon, 10 Jun 2013, 08:51
Afr0
"Why should an MMO be some sort of ultimate goal? It's just a style of game; it's not intrinsically better. I don't even care for that style of game, myself."


That wasn't even my point. My point was, if you work on a big project, you learn new things. If you keep making new games using the same framework, you don't.
Jay isn't even learning to use the new languages he's using, because he refuses to learn OOP.

-=-=-
Afr0 Games

Project Dollhouse on Github - Please fork!
Mon, 10 Jun 2013, 10:45
Cower
So the deal with "new" OpenGL (aka GL 3.2 and up, particularly the core profile) is that it deprecates the entire fixed-function pipeline. This also means that a lot of stuff, more than most people realize, is gone. And the kicker is this: it's usually a tiny bit slower than doing the same thing (shaders and such) in GL 2.1. You don't have to switch, and most games don't; almost all recent GL games are GL 2.1 or so, because it runs on more systems and the driver support has been optimized to hell and back.

Using 3.2 and above won't really do much for you unless there's something new you need. For example, glTextureView in GL 4.3 is awesome, but I can't use it because I target 3.2 as both minimum and maximum (Mac OS only supports either 3.2 or 2.1 and doesn't let you mix the two). Other stuff, like buffer objects and uniforms, the mainstays of rendering in GL, hasn't changed, and if you're already using them in GL 2.1, fine, there's no reason to switch.

Again, as far as I'm concerned, the main hurdle is replacing the deprecated math functions, because that's no fun to code. I have my own set of 3D math classes for this, glm is another option, etc., so it's a mostly solved problem depending on how you want it to work. Everything else is just getting used to the shader stages, and there are only three of them in 3.2 (vertex, geometry, fragment), so that's easy enough.
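
For example, replacing glOrtho just means filling in the 4x4 matrix yourself and handing it to the shader as a uniform. A rough sketch (column-major, as GL expects; the "projection" uniform and shaderProgram names are just examples):

/* Roughly what glOrtho used to do, written out by hand. */
void ortho2d(GLfloat *m, float left, float right, float bottom, float top)
{
    const float znear = -1.0f, zfar = 1.0f;
    for (int i = 0; i < 16; i++) m[i] = 0.0f;
    m[0]  = 2.0f / (right - left);
    m[5]  = 2.0f / (top - bottom);
    m[10] = -2.0f / (zfar - znear);
    m[12] = -(right + left) / (right - left);
    m[13] = -(top + bottom) / (top - bottom);
    m[14] = -(zfar + znear) / (zfar - znear);
    m[15] = 1.0f;
}

/* Build it once and hand it to the shader. */
GLfloat proj[16];
ortho2d(proj, 0.0f, 800.0f, 600.0f, 0.0f);    /* bottom/top swapped so (0,0) is the top-left */
glUseProgram(shaderProgram);                   /* uniforms go to the program currently in use */
glUniformMatrix4fv(glGetUniformLocation(shaderProgram, "projection"), 1, GL_FALSE, proj);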

Anyway, the transition isn't as big or bad as it sounds, it just requires a different approach than the old fixed function one.