C, C++, C# Programming - what is the sense in this


PB666


For starters, you can write an operating system in FORTRAN.

Not with any of the parts of FORTRAN77 I learned. I understand that FORTRAN since then (i.e. F90 and later) may have things like pointers (that weren't fixed to arrays), but F77 just wasn't made for that sort of thing (and C was, especially since Ritchie was also busy creating UNIX). There were other parts of the language that appeared insane (call by value, lack of recursion) but may have been leftover from FORTRAN66*. If you couldn't write an OS in it till the 1980s, it was too late: Unix had taken over.

The old claim was "I don't know what language will be used for scientific calculation in 20 years, but it will be called FORTRAN". I'm surprised that isn't the case anymore. I'm also wondering at what point SciPy [scientific Python] will be able to directly compute things without using C [for large enough arrays]. I think that is the next big push for PyPy.

* When I was learning FORTRAN programming on UNIVAC 11xx series machines in the 1980s (presumably to discourage engineering students) I discovered in my basement a UNIVAC 11xx programming manual (probably for FORTRAN66, but I can't remember and didn't carefully save the binder like I should have). It took me a while to realize that it wasn't mine; it was my mother's from a job from before she had me.


Fortran is dying out in labs, and for a very good reason. There are much better libraries for C/C++ these days, and most of the old code written in Fortran, including some of the libraries, is absolutely atrocious. Specifically because in F77 it was very difficult to avoid bad programming practices, and while F90 is quite decent, many Fortran programmers carried their bad habits over. To put it plainly, I've seen goto statements in F90 libraries. That's bad. That shouldn't happen.

Like I've said earlier, though, Fortran could have easily been where C is if there was no C. Out of languages currently out there, barring all C-like languages, Fortran is the next best choice for building OSes and computation libraries.

GOTO statements were how people used to program; that's how you used to loop. You assume that just because one can have bad habits, one does. Part of the evolution was that MS got rid of Gosub and replaced it with Subs. They added type definitions, they added Option Explicit, they added functions, far better designed than the C# functions. All these things can be used if the user chooses to; you don't have to have bad practices in any language.


The problem I had with VB was its compiler... it was terrible at optimizing. I didn't care for the framework either for managing overlays; I found it a pain in the butt... that problem could also extend itself into the linker used as well. I also found that people would use VB exclusively for accessing the Windows API, allowing, or rather relying, on Windows to do a lot of the work. That was all fine and well for coming up with a Windows-like familiar user interface, but Windows is - well - sloppy in and of itself, and slow. It was because of these things that a number of the houses I did work with opted for 4GLs instead... Fox, Clipper, Recital, etc.


The problem I had with VB was its compiler... it was terrible at optimizing. I didn't care for the framework either for managing overlays; I found it a pain in the butt... that problem could also extend itself into the linker used as well. I also found that people would use VB exclusively for accessing the Windows API, allowing, or rather relying, on Windows to do a lot of the work. That was all fine and well for coming up with a Windows-like familiar user interface, but Windows is - well - sloppy in and of itself, and slow. It was because of these things that a number of the houses I did work with opted for 4GLs instead... Fox, Clipper, Recital, etc.

Yes, the GUI is nice, but the intricate parts of VB are the sloppiest parts, particularly the scrollable lists. I only use one ActiveX control, and that is in Excel; it throws up a code box and a way to start and capture transfer bugs. The VB-sourced .exe files are not standalone, so it's better not to use the GUI if possible. The problem since Vista is that unless it's a background program, you almost have to use a visual interface. XP was the last MS OS to have a DOS emulation mode, and it is hideously slow. It's not impossible to run text-only display and entry, but it becomes stone knives and bearskins. Excel basically is a VB GUI, so what's the point of making a standalone now? I mean, you can open and save a text file from Excel's VBA, so at worst you need to use two programs instead of one.


Not with any of the parts of FORTRAN77 I learned.

You'd need a few files written in Assembly to talk to the hardware, but that's true of any language. And sure, between C and F77 for writing an OS, it's C hands down. But you totally could write an OS in F77. For something simple like loading from a floppy drive and browsing a partition on the same, I could write the code in an afternoon. I'll take a look at how flash drive access works from BIOS. Maybe I can do the same with one of these as a proof of concept.

The problem I had with VB was its compiler... it was terrible at optimizing.

Because it's a terrible language, with absolutely no structure for the optimizer to work with. All it's good for is not punishing you for being bad at programming, which is why it has so many fanboys.

GOTO statements were how people used to program, that's how you used to loop.

I don't have a problem with early languages mimicking jump instructions in assembly, which are a reflection of the architecture. I have a problem with seeing goto statements in F90 and C, which are modern languages with structured loops. If you are using a goto statement in one of these, you are doing it wrong. You are a terrible programmer, and you should either learn not to do this or do something else. Not only does it make code harder to read and debug, but it makes it impossible for the optimizer to do its job. There is zero excuse for using goto in modern languages.


I wouldn't say it's a "terrible" language; it is as the name implies - Basic, and very 'basic' at that. Ease of use, and yes, forgiving. I have to say, I was glad many a time to have it around, as when times called for a quick fix to some file or database, one could always whip up a quick Basic program to do the job. Of course, that was 'back then'; things are a bit different now, especially with databases. Still, I think one of the best features of Basic is its use as an introduction tool to programming - it's easy to learn, the general principles are inherent to pretty much all other languages (C included), and once gotten a grasp of, it can ease the learning curve to other more robust languages. I've sat complete computer illiterates down (really, idiots lol) and in 30 minutes' time have had them writing simple, functional Basic programs ... and the ideas / concepts that go with it stuck. Then again, maybe it's all in the way one teaches?

Without going into lengthy explanation, I'll just say that I disagree with your view on the use of a GOTO statement (in Fortran)... only because I've been there and done that, and to my experience, there have been situations where execution efficiency required its use.

Programming style, like writing style, is somewhat of an art and cannot be codified by inflexible rules, although discussions about style often seem to center exclusively around such rules. In the case of the goto statement, it has long been observed that unfettered use of goto's quickly leads to unmaintainable spaghetti code. However, a simple, unthinking ban on the goto statement does not necessarily lead immediately to beautiful programming: an unstructured programmer is just as capable of constructing a Byzantine tangle without using any goto's (perhaps substituting oddly-nested loops and Boolean control variables, instead). Many programmers adopt a moderate stance: goto's are usually to be avoided, but are acceptable in a few well-constrained situations, if necessary: as multi-level break statements, to coalesce common actions inside a switch statement, or to centralize cleanup tasks in a function with several error returns. (...) Blindly avoiding certain constructs or following rules without understanding them can lead to just as many problems as the rules were supposed to avert. Furthermore, many opinions on programming style are just that: opinions. They may be strongly argued and strongly felt, they may be backed up by solid-seeming evidence and arguments, but the opposing opinions may be just as strongly felt, supported, and argued. It's usually futile to get dragged into "style wars", because on certain issues, opponents can never seem to agree, or agree to disagree, or stop arguing. - https://en.wikipedia.org/wiki/Goto#Criticism

goto.png

Which brings me back to my first days of learning programming where GOTO was a go-to function for more or less anything. Need to jump around the code? GOTO. Need to iterate something? GOTO. Need to delay the execution of something? GOTO.

Then I moved away from Hello World, and GOTO had to go to hell.


I wouldn't say it's a "terrible" language; it is as the name implies - Basic, and very 'basic' at that. Ease of use, and yes, forgiving. I have to say, I was glad many a time to have it around, as when times called for a quick fix to some file or database, one could always whip up a quick Basic program to do the job. Of course, that was 'back then'; things are a bit different now, especially with databases. Still, I think one of the best features of Basic is its use as an introduction tool to programming - it's easy to learn, the general principles are inherent to pretty much all other languages (C included), and once gotten a grasp of, it can ease the learning curve to other more robust languages. I've sat complete computer illiterates down (really, idiots lol) and in 30 minutes' time have had them writing simple, functional Basic programs ... and the ideas / concepts that go with it stuck. Then again, maybe it's all in the way one teaches?

All of the good things I can say about Basic, I can also say about PHP and Python, for example. But I can also say far, far fewer bad things about them. I can sort of buy that, oh, 30 years ago, BASIC had its place as a learning tool or a quick fix-up tool. We had it on school machines, the old variety where you had to put line numbers for each line, and it made sense. It was that or jumping straight into Turbo C or Turbo Pascal, which had a much steeper learning curve.

Today, though? There is no good reason for Basic.

Without going into lengthy explanation, I'll just say that I disagree with your view on the use of a GOTO statement (in Fortran)... only because I've been there and done that, and to my experience, there have been situations where execution efficiency required its use.

And I can point you to some libraries in Fortran that I had to give up on, because I couldn't figure out how they worked, because the entire logic was a mess of GOTO statements. I know they were just nested loops with occasional breaks, but short of printing out all 100 or so pages, decorating walls with them, and drawing arrows all over the place, there was no way to make sense of the code. Ok, so absolutely every single variable having names consisting of the same letter repeated an arbitrary number of times did not help, but I could have worked around that if the control paths made sense.

I was able to compile them, and cross-link them into my C code, but if I needed to change anything about that library, it was a fool's quest. So I ended up either replacing them with C libraries or writing my own.

And again, you might think you're helping execution efficiency with a goto statement, but unless you're writing an Assembly program, you're ultimately working hand-in-hand with an optimizer. And optimizers don't play well with goto loops. Were you enjoying that use of a register as a loop counter, which sped up your tightest loop by a factor of three? Well, it's gone now, because the optimizer could not predict all consequences of that goto, and now the counter has to be placed on the stack.

In fact, if you think you can come up with any example where optimized code runs faster with a goto statement, I would be happy to see it. I think you'll find that your perception doesn't match the behavior of modern compilers.


I don't have a problem with early languages mimicking jump instructions in assembly, which are a reflection of the architecture. I have a problem with seeing goto statements in F90 and C, which are modern languages with structured loops. If you are using a goto statement in one of these, you are doing it wrong. You are a terrible programmer, and you should either learn not to do this or do something else. Not only does it make code harder to read and debug, but it makes it impossible for the optimizer to do its job. There is zero excuse for using goto in modern languages.

Guess all the Linux coders are stupider than you are?

http://lxr.free-electrons.com/source/drivers/mtd/mtdchar.c#L61

(just open a random piece of code in the linux kernel, 80% chance you see goto's)

Never say never in programming. There are always edge cases.
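
(Those kernel gotos are mostly the centralized-cleanup pattern the Wikipedia quote above mentions: acquire resources, jump to a single exit path on failure. A minimal sketch of the idea, with made-up names rather than anything taken from mtdchar.c:)

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical example of kernel-style error handling: acquire resources,
   undo them in reverse order via one chain of cleanup labels instead of
   repeating the cleanup in every error branch. */
static int do_work(const char *path)
{
    int ret = -1;
    FILE *f = NULL;
    char *buf = NULL;

    f = fopen(path, "rb");
    if (!f)
        goto out;

    buf = malloc(4096);
    if (!buf)
        goto out_close;

    if (fread(buf, 1, 4096, f) == 0)
        goto out_free;

    ret = 0; /* success */

out_free:
    free(buf);
out_close:
    fclose(f);
out:
    return ret;
}

int main(void)
{
    return do_work("example.dat") == 0 ? 0 : 1;
}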


I don't have a problem with early languages mimicking jump instructions in assembly, which are a reflection of the architecture. I have a problem with seeing goto statements in F90 and C, which are modern languages with structured loops. If you are using a goto statement in one of these, you are doing it wrong. You are a terrible programmer, and you should either learn not to do this or do something else. Not only does it make code harder to read and debug, but it makes it impossible for the optimizer to do its job. There is zero excuse for using goto in modern languages.

I wouldn't automatically label every bit of code with a goto statement as 'terrible'. Goto is simply jmp in assembly, and under the hood many 'proper' things in source code are compiled into jump opcodes. If you view the native code, all your for (...), while (...), do { ... } while (...), switch (...) { case ...: ... }, break and continue are basically goto statements.
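
(For instance, here is roughly what a plain while loop lowers to; a minimal made-up sketch, since actual compiler output will differ:)

#include <stdio.h>

int main(void)
{
    int i = 0;
    while (i < 3)            /* structured form */
    {
        printf("%d\n", i);
        ++i;
    }

    int j = 0;
loop:                        /* the same loop spelled with gotos */
    if (!(j < 3))
        goto done;
    printf("%d\n", j);
    ++j;
    goto loop;
done:
    return 0;
}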

I don't use goto simply because someone told me that 'it's terrible', but if you stop and think about it, sometimes it CAN be justified.

For example:

Let's say you've got this algorithm (A, B, C, D are some operations ):

https://i.imgur.com/iUNJI6e.png

Let's code it the proper way:


char bf1, bf2, bf3;
if (a)
{
    A;
    bf1 = 1;
}
else
    bf1 = 0;

bf2 = 0;
do
{
    do
    {
        if (bf3 || b)
            bf3 = 1;
        else
            bf3 = 0;
        if (bf3 || bf2)
            B;
        if (bf3 || bf1 || bf2)
        {
            C;
            bf1 = 0;
            bf2 = 1;
        }
        if (!bf3)
        {
            if (!c)
            {
                D;
                bf3 = 1;
            }
            else
            {
                bf3 = 0;
                bf2 = 0;
            }
        }
    }
    while (bf3);
}
while (bf2);

E;

Now, some heresy with goto:


if (a)
{
    A;
    goto L3;
}
L1:
if (b)
{
L2:
    B;
L3:
    C;
    goto L1;
}
else if (!c)
{
    D;
    goto L2;
}
E;

So, which code is easier to understand and maintain?

Another example of heresy:

char a, b, c;

for (a = 0; a < 10; ++a)
{
    for (b = 0; b < a; ++b)
    {
        if (!c)
            goto Leave;
    }
    for (b = 10; b < 15; ++b)
    {
        d ();
    }
}

Leave:
e ();

and the 'proper' way with a flag:


char a, b, c, f1;

f1 = 1;
for (a = 0; a < 10 && f1; ++a)
{
    for (b = 0; b < a && f1; ++b)
    {
        if (!c)
            f1 = 0;
    }
    if (f1)
    {
        for (b = 10; b < 15; ++b)
        {
            d ();
        }
    }
}

Here you add a condition check on each iteration simply because using 'goto' is TERRIBLE.

Goto, like any other language element (in any programming language) is a tool. How you use it determines if you are good or bad, but I doubt the mere fact of using it automatically labels you 'terrible programmer'.


Too forgiving, ergo begin anything longer than 50 lines with OPTION EXPLICIT and start your structure in the declaration section. Look up the "0" vs "O" controversy in Fortran; I can't tell you how many nights of grief this caused on FORTRAN IV. In the Basic IDE, just substitute an uppercase letter mid-word during a set statement, and if the variable is misspelled it won't auto-flip back to lowercase.

In VB you could get away with this:

Sub Foo()
End Sub

Sub Foo2(i As Integer, s As String)
End Sub

Sub Bar()
    Foo ' Note the absence of parentheses
    Foo2 2, "test"
End Sub

I still have nightmares of porting this to C#.


In VB you could get away with this:

Sub Foo()
End Sub

Sub Foo2(i As Integer, s As String)
End Sub

Sub Bar()
    Foo ' Note the absence of parentheses
    Foo2 2, "test"
End Sub

I still have nightmares of porting this to C#.

That is one of my peeves: the closures on statement blocks should be unique and standardized. This is the reason you do want functions and subs, to reduce the number of nested loops and conditionals. You can formally Call in Basic, but it doesn't read as well.


That is one of my peeves: the closures on statement blocks should be unique and standardized. This is the reason you do want functions and subs, to reduce the number of nested loops and conditionals. You can formally Call in Basic, but it doesn't read as well.

The stupidest idea about VB.Net is that it tried to maintain backward compatibility with VB6. That (a) forced the Microsoft marketing department to set the default setting for Option Strict and Option Explicit to Off, and (b) included the Microsoft.VisualBasic namespace. I shudder to think what would happen if the same approach was used to make C# backwards compatible with MSVC++.


The stupidest idea about VB.Net is that it tried to maintain backward compatibility with VB6. That (a) forced the Microsoft marketing department to set the default setting for Option Strict and Option Explicit to Off, and (b) included the Microsoft.VisualBasic namespace. I shudder to think what would happen if the same approach was used to make C# backwards compatible with MSVC++.

I stopped upgrading with VB6 and Office 2007 just to avoid the .Net conversion until the current research lines had played out. Now, with Win10 and VB.Net and issues with connectivity, it was time to move.

TIL that type declarations - gone, structures .... great ... I'll just ...... wth.

Public, Private, Friend .... why do I need all these in my friggin' type declarations if I make that type declaration private?

So now I wanna be in MonoDev, but I'm having to relearn VB in VS Express.

And my 'type declarations' have subroutines; oh, that definitely is a performance improvement, lol.

The reason to program in Basic is so your thoughts and your algorithm are fluid with the language. If all the time is spent trying to figure out the language, all those nifty thoughts gravitate to figuring out the code.


The reason to program in Basic is so your thoughts and your algorithm are fluid with the language. If all the time is spent trying to figure out the language, all those nifty thoughts gravitate to figuring out the code.

This is valid ONLY if English is your native language or you know it well enough. I know .Net developers in my country who know no more than 10 words of English. For them, C#, with fewer keywords, is less stressful. Then again, class names are a separate pain (to the point that I once saw a whole bunch of class wrappers that simply renamed English class names to make them more meaningful for coders).


This is valid ONLY if English is your native language or you know it well enough. I know .Net developers in my country who know no more than 10 words of English. For them, C#, with fewer keywords, is less stressful. Then again, class names are a separate pain (to the point that I once saw a whole bunch of class wrappers that simply renamed English class names to make them more meaningful for coders).

Well, knowing the King's English does not immunize you from geek speak. For example: Class, Single, Double, Char (which is not a char, but a number that represents that char), ByVal, Option Strict, Const, Dim, Sub, Redim, Static, Friend, Gosub, ^ (none of which have the meanings in Basic that a non-computer-savvy person might expect). And of course, if your friends (not the Friends we just invented in Visual Studio) read Reddit, then they already know what TIL, WT_, NSF_, AMA, IAMA, and DAE mean. If they are having trouble with "If Then Else End If" or "For Next", then I would tell them they have a whole lot of woe in store for them in RL, 'cause the geeky-speaky urban lexicon is quickly becoming the lingua franca of a whole generation of folks who have early-onset device-induced social dysfunction.


I do not understand the aims of the author (PB666) of the original post. It seems like you have a project and want to learn about half a dozen languages to finish it. Meanwhile, it also seems that you want to optimize math operations on your own, which almost nobody does (for good reasons), using languages that you just learned (a terrible idea if I ever heard one).

So the first thing to learn is: 'Premature optimization is the root of all evil'.

Pick one language and implement what you need. If the stuff is complex, do yourself a favor and pick a comfortable language (C#, Python). Then run it. If it turns out that performance is good, you are done. If not, _profile_ it and attack the slowest part. I can tell you beforehand that the slowest part is not going to be calculating square roots, which is good because there is very, very little to gain there. Then go on and learn how to make that slow piece better. Maybe you will be able to do so still within your language of choice, maybe you will need pure C. But you will be coding something like 100 lines of C tops.

Nowadays you can push pieces of, for example, Python code to speeds equal to or even faster than C if you need to, often without writing a line of C. You just need to find which pieces.


I wouldn't automatically label every bit of code with a goto statement as 'terrible'. Goto is simply jmp in assembly, and under the hood many 'proper' things in source code are compiled into jump opcodes. (...)

Goto, like any other language element (in any programming language) is a tool. How you use it determines if you are good or bad, but I doubt the mere fact of using it automatically labels you 'terrible programmer'.

None of the above is the best way IMHO. I think coding the different options as functions, and calling them is the best option. The "loop" can end up in some kind of recursion. But the tail-call optimization of any C compiler will fix growing stack issues.
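
(A minimal sketch of that, using the A/B/C/D/E flow from the quoted post, with the conditions and operations stubbed out as hypothetical functions. Note the C standard does not guarantee tail-call elimination; gcc and clang typically perform it at -O2 as sibling-call optimization:)

#include <stdio.h>

static int a(void) { return 0; }   /* placeholder conditions */
static int b(void) { return 0; }
static int c(void) { return 1; }
static void A(void) { puts("A"); } /* placeholder operations */
static void B(void) { puts("B"); }
static void C(void) { puts("C"); }
static void D(void) { puts("D"); }
static void E(void) { puts("E"); }

static void state_L1(void);

static void state_L2(void)         /* B, then C, then back to the decision */
{
    B();
    C();
    state_L1();                    /* tail call */
}

static void state_L1(void)         /* the decision point (old label L1) */
{
    if (b())
        state_L2();
    else if (!c())
    {
        D();
        state_L2();                /* old "goto L2" */
    }
    else
        E();                       /* exit */
}

int main(void)
{
    if (a())
    {
        A();
        C();                       /* old "goto L3" entry */
    }
    state_L1();
    return 0;
}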


None of the above is the best way IMHO. I think coding the different options as functions, and calling them is the best option. The "loop" can end up in some kind of recursion. But the tail-call optimization of any C compiler will fix growing stack issues.

It is nearly always good to pack things inside a function, but those examples were snippets that you can drop anywhere.

But, talking about performance-critical code, I once had a simulation where I dropped >70% of the runtime by doing the parameter check before (outside) my refresh function. Construction of the stack frame for the function was a waste of cycles when it often exited straight away because the set of parameters did not trigger the full refresh procedure.
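
(A minimal sketch of that idea with hypothetical names - a cheap flag check hoisted to the call site so the expensive routine's stack frame is only built when there is real work to do:)

#include <stddef.h>

typedef struct
{
    int    dirty;   /* does this cell actually need refreshing? */
    double value;
} cell_t;

static void full_refresh(cell_t *c)   /* stands in for the heavy routine */
{
    c->value *= 0.5;                   /* placeholder work */
    c->dirty = 0;
}

static inline void maybe_refresh(cell_t *c)
{
    if (!c->dirty)                     /* cheap early-out at the call site: */
        return;                        /* no call, no stack frame setup     */
    full_refresh(c);
}

static void refresh_all(cell_t *cells, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        maybe_refresh(&cells[i]);
}

int main(void)
{
    cell_t cells[4] = { { 1, 2.0 }, { 0, 3.0 }, { 1, 4.0 }, { 0, 5.0 } };
    refresh_all(cells, 4);
    return 0;
}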

So by all means, keep it tidy, but you may have to make it messy later on.

On the example of the A B C D E operations, another optimization is running the code with example data and, based on the number of calls, placing the most common case first to bypass the 'ifs' more often.


(...)

I don't use goto simply because someone told me that 'it's terrible', but if you stop and think about it, sometimes it CAN be justified.

For example:

Let's say you've got this algorithm (A, B, C, D are some operations ):

https://i.imgur.com/iUNJI6e.png

(...)

Goto, like any other language element (in any programming language) is a tool. How you use it determines if you are good or bad, but I doubt the mere fact of using it automatically labels you 'terrible programmer'.

I agree with your assessment of goto. In addition, my father, who had a weak spot for Basic (and he started his career in machine code, not that sissy assembler stuff with labels that the young ones now think is hardcore), always told me: "there are no bad languages, just bad programmers". I'm sure that can be extended to statements!

Having said that, my first hunch looking at that diagram would be to stop and figure out how to streamline that monster.


I do not understand the aims of the author (PB666) of the original post. It seems like you have a project and want to learn about half a dozen languages to finish it. Meanwhile, it also seems that you want to optimize math operations on your own, which almost nobody does (for good reasons), using languages that you just learned (a terrible idea if I ever heard one).

So the first thing to learn is: 'Premature optimization is the root of all evil'.

Pick one language and implement what you need. If the stuff is complex, do yourself a favor and pick a comfortable language (C#, Python). Then run it. If it turns out that performance is good, you are done. If not, _profile_ it and attack the slowest part. I can tell you beforehand that the slowest part is not going to be calculating square roots, which is good because there is very, very little to gain there. Then go on and learn how to make that slow piece better. Maybe you will be able to do so still within your language of choice, maybe you will need pure C. But you will be coding something like 100 lines of C tops.

This is a joke, right? My structures and consts all by themselves will be 100 lines. I have no choice but to move to C; VB is too slow, and Win10 makes it even slower. The assumption here is that you know what the density of Sqrt operations is in the code. As already described above, I might get away with a substantial performance gain by tweaking the compiler, more if I use the inverse float square root function. Read the post: if you use an older C++ math.sqrt it can take 400 cycles; they managed to get that down to 3 by specifying the instruction set and using x * rsqrtss(x). lol, 100-fold difference........
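
(A minimal sketch of that x * rsqrtss(x) trick using SSE intrinsics - single precision only, x = 0 not handled, with one Newton-Raphson step to claw back some of the precision the approximation loses; actual cycle counts depend on the CPU and compiler flags:)

#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

/* Approximate sqrt(x) as x * (1/sqrt(x)), using the fast rsqrtss
   approximation (~12 bits) refined by one Newton-Raphson step:
   r' = 0.5 * r * (3 - x * r * r). */
static float fast_sqrt(float x)
{
    __m128 v = _mm_set_ss(x);
    __m128 r = _mm_rsqrt_ss(v);                 /* r ~= 1/sqrt(x) */
    __m128 half  = _mm_set_ss(0.5f);
    __m128 three = _mm_set_ss(3.0f);
    r = _mm_mul_ss(_mm_mul_ss(half, r),
                   _mm_sub_ss(three, _mm_mul_ss(v, _mm_mul_ss(r, r))));
    return _mm_cvtss_f32(_mm_mul_ss(v, r));     /* x * 1/sqrt(x) = sqrt(x) */
}

int main(void)
{
    printf("%f\n", fast_sqrt(2.0f));            /* ~1.414214 */
    return 0;
}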

To get a feel, take four character types, let's make them bytes; now make random-sized byte strings that accumulate to 150 GB. And you don't know a priori how big the set is in terms of the number of byte strings. Now I am going to give you 6 GB divided into 22 units.

These units are not identical to the first, not even in pieces; they vary in string sequence, multiple types can exist at a position in a string, and their identities are known to shrink and swell. Your job is to find which unit of the second set every piece in the first set best fits into, and then fit those byte strings into a long chain. In addition there is degeneracy: some pieces best fit in several places, and other pieces may not fit at all, so these have to be set aside. The inclusions and set-asides are determined mathematically using log functions and square roots.

All of this can be done with known algorithms on mainframes, no problem. But now I am going to give you data that is not in the public domain; it's coded differently from the first set, so the public alignment routines will give false optimals. So now you need to mimic that mainframe process on the PC, with a much higher frequency of variation in conveniently defined strings, without cherry-picking. IOW, the boundaries of the variation should be self-defining.


https://i.imgur.com/iUNJI6e.png

Now, some heresy with goto:

Yeah... Now implement the garbage you've written vs. the code that Jouni wrote and time it. Watch your code thrash around miserably because it ended up being unoptimized.

Sure, if you can't write the "proper way" properly, everything looks like it needs goto. But that's, once again, just your inability to write good code.

Goto is simply jmp in assembly, and under the hood many 'proper' things in source code are compiled into jump opcodes.

And many are not. Some loops are going to be completely unrolled by an optimizer. Many bits of code will be moved around, so that the jumps have a completely different structure.

Unless you are actually coding in assembly, the code you are writing and the machine code you will get are going to be pretty different. If you understand how the optimizer works, you can predict that and help it out to get faster code. If you don't, your best bet is to not interfere with it. And getting "clever" with goto statements is exactly that sort of interference.

So do yourself a favor, and learn to write proper code. That's what the optimizer expects, and that's what will have better performance, despite your misconceptions. Of course, the main reason to write proper code is the fact that it's easier to read and debug. Again, see Jouni's solution. The fact that it will also run better is just a bonus in this case.

