Why my friend had to retake his C++ course with a different professor
-
I've heard of VATS being useful for a melee build as well, but who doesn't want to be a ninja?
>implying you can only be melee or ninja, completely ignoring the possibility of frontline rifleman
-
Oh, and more reasons why I hate VATS:
-
The enemy's HP bar shows the damage that would be dealt by the next shot you mark, not by the ones you've already marked - meaning that to see how much total damage your current setup will deal, you must undo your last mark, look at the bar, and then mark him again. That takes time, which may 1) get you hit, and 2) change the accuracy of your shots or let the target take cover.
-
Mouse controls suck.
-
-
Alright, so carry on enjoying Fallout incorrectly, I guess. I know I will...
-
TIL you can enjoy video games incorrectly. I'm not exactly sure what it means or what the consequences are, but it looks like it's bad and I should feel bad.
I wonder what other things you can enjoy incorrectly...
...on second thought, no, I'm not thinking about it. I don't want another million-post thread featuring animals in bras.
-
Alright, so carry on enjoying Fallout incorrectly, I guess. I know I will...
Pfft. I'm enjoying it fine without using VATS TYVM.
-
Yeah, the new VATS sucks. I'm actually going back to good old F2 days.
-
It's not even that it sucks - it's just I didn't see any benefit.
I'm not running a stealth build, I have no desire to sneak around everywhere. VATS just slowed me down.
-
Well, if VATS worked as it did previously - stopped time - it could have some uses. Now you just have to battle with shitty mouse controls, hoping you have enough time to do it. Is it worth it? Not really, IMHO. It's actually better to get all the resistance perks (including Rooted), get into PA, and just stay in one place and pour lead all around. To hell with all this stealth shit - when a deathclaw can't take more than 10% of your HP, you know you're doing it right.
-
get into PA, and just stay in one place and pour lead all around.
Exactly this.
-
Haven't actually played stealth build
Headshots with a suppressed weapon (given that they're not aware of you for that delicious stealth bonus) will drop almost everything on the first pop. And won't alert his friends. The AI is still very much on the level of “Oh, my buddy's head has just exploded. Let's go over and see if he is OK…”
-
Headshots with a suppressed weapon (given that they're not aware of you for that delicious stealth bonus) will drop almost everything on the first pop.
Even within first 20 levels?
-
Even within first 20 levels?
If you get the right weapon. You also need to be very keen on taking stuff out from a distance to start with. It helps that low-level enemies mostly can't shoot for shit.
Blowing up a car next to that early deathclaw helped.
-
If you get the right weapon.
By the right weapon, you mean a bugged shotgun with a bleeding bonus?
You also need to be very keen on taking stuff out from a distance to start with.
Not always possible.
-
I think a lot of people seem to be missing the point that it's a Fallout game, so there are umpteen viable playstyles. Frankly, if you can't find a single way to achieve your goals consistently (including talking the enemy to death), then you are playing it wrong...
-
Yeah, when I finally run out of side quests and bother to finish the main quest line, I'm thinking of starting again and playing it differently. Different perks, different weapons etc.
-
...or pick different allies, use some new companions, etc... Damn, being married is drastically limiting my FO4 time... :``(
-
...or pick different allies
Yeah, but you don't really need to pick sides until near the end of the main quest line, which is where I got to before parking it to go and get some more side quests in while I'm still friendly with all factions.
use some new companions
True. I've only used either Dogmeat or Danse.
-
It's not even that it sucks - it's just I didn't see any benefit.
It makes the game accessible to RPG players who don't like/aren't good at first-person shooters.
-
@loopback0 said:
It's not even that it sucks - it's just I didn't see any benefit.
It makes the game accessible to RPG players who don't like/aren't good at first-person shooters.
Ah - I've not really played RPGs, so I didn't think of that.
-
I don't give a fuck about this thread, but just wanted to note that Dipjorge has broken the HOME and END keys again. They just go to the top or bottom of ~~the page~~ Jeff's dickhole, and not to the top or bottom of the thread.
-
norepro
-
And soon we will have Java rewritten in Minecraft.
-
Don't you just love the situatuins where someone has zero experience in some ragard but posts like an expert? I'm drunk now so I'll allow myslef this - blaky - your a piece of shit idiot and you know shit. go kill yourself,m because you are shit. Or get a clue and THEN post.
Literally LOL.
-
I would normally be embarrassed by drunk-posting, but then I realized all I wrote is true.
-
How did I miss that post?!
-
1) On bad C++ teaching: I've had the same problem with my class's introduction to C++, which was C with classes and led to me having to completely re-learn C++ from a book. And not even that good a book.
Dude even showed us operator overloading, with an example of `operator+` that actually modified its left operand like `operator+=`.
2) On binary files: It has occurred to me that C++ actually hates binary files. Its support for them is awkwardly tacked-on to the point that even streams opened in binary mode deal in characters instead of bytes, and they can also be screwed with by locale settings. Frankly, even in C++ you're better off using the `<cstdio>` streams when it comes to binary files. Oh, and it's equally annoying that C++ provides no bridging classes to those, despite supporting them as the standard mandates.
At least .Net makes much more sense on this: You've got `Stream`s that work on bytes, `Encoding`s that turn characters into bytes, and `TextWriter`s that handle formatting stuff into strings. Simply putting these together (with a `StreamWriter`) gives you something that formats stuff into strings, encodes them into bytes, and writes them wherever you want (file, console output, memory, network...)
-
The conversion of floating point is all right; there are differences when you want to pad a string with zeroes on the left, and the `snprintf` version of the function behaves differently between VC++ and GNU C when there isn't enough space in the buffer.
VC++ doesn't actually have a `snprintf` function as the standard defines it. It only has two separate, custom functions that predate C99 but probably not POSIX: `_snprintf` and `_scprintf` (and I'm not talking about the `_s` versions here).
There is, however, one thing regarding `*printf` that VC++ handles differently from the standard where I think VC++ is right and the standard is dumb: the way they handle wide-character strings. Mostly because the standard doesn't provide any way to say "same character width as the current function and format string" (whereas VC++ repurposes the default `%s` as this, while using a specific format specifier for "always narrow" strings).
-
"same character width as the current function and format string"
And when they differ from each other?
There's the sane way of doing things — say that formatting is always done with characters, and that binary data is just binary data and uses bytes, not characters — and then there's the mess that C is in. (And C++. And all the various vendor extensions to both.) The sane way requires being very clear that the outside world is in possibly multiple encodings, so it requires subtle controls in the right places, and the abstractions involved will leak.
If you've got standard library code that doesn't do encodings right then it should at least pass the bytes around right since then it is possible to put something on top that is less awful.
-
2) On binary files: It has occurred to me that C++ actually hates binary files.
You can use any unformatted output for that, but it's true - the stdlib tends to be designed with portability in mind.
-
1) On bad C++ teaching: I've had the same problem with my class's introduction to C++, which was C with classes and led to me having to completely re-learn C++ from a book. And not even that good a book.
Dude even showed us operator overloading, with an example of `operator+` that actually modified its left operand like `operator+=`.
2) On binary files: It has occurred to me that C++ actually hates binary files. Its support for them is awkwardly tacked-on to the point that even streams opened in binary mode deal in characters instead of bytes, and they can also be screwed with by locale settings. Frankly, even in C++ you're better off using the `<cstdio>` streams when it comes to binary files. Oh, and it's equally annoying that C++ provides no bridging classes to those, despite supporting them as the standard mandates.
At least .Net makes much more sense on this: You've got `Stream`s that work on bytes, `Encoding`s that turn characters into bytes, and `TextWriter`s that handle formatting stuff into strings. Simply putting these together (with a `StreamWriter`) gives you something that formats stuff into strings, encodes them into bytes, and writes them wherever you want (file, console output, memory, network...)
Have you considered all the gobble you have to do in order to deal with files properly in a platform-independent way?
Let's start by ignoring files that aren't stream-oriented. So let's stay on UNIX-likes, DOS/Windows, and Mac. We'll come back to non-stream-oriented files in a moment. They cause... issues.
So a file is a seekable stream of bytes, right? Well, a text file isn't, because it contains line endings, and while they are consistent on any one platform(1), they aren't consistent between platforms. On Mac, traditionally, the representation of a line-end is a '\x0d', that is, an ASCII CR. On UNIX-likes, it's a '\x0a', an LF. On DOS/Windows(1) it's a CR then an LF. This difference in size means that on text files, fseek()ing to an absolute location that wasn't obtained by ftell() is UB, unless that location is 0 relative to begin or end.
Binary stream-oriented files lack this translation because by definition they have no concept of lines, but (TIL that) they are still affected by gobble in the STL and/or C-stdlib. (Arguably, that's part of the "gobble you have to do in order to deal with files properly" except that it hardly seems like it is being done properly.)
Now, up above, I mentioned the possibility of files that are not stream-oriented. The particular case I'm thinking of is files as implemented on the Michigan Terminal System, an alternative OS for IBM System/3xx mainframes. It's a bit long in the tooth now - I last touched it in 1986 on a 3081D - but its files were line-oriented, not stream-oriented. That is, every file had an internal structure that allowed you to read and write lines rather than being just an array of bytes. The lines were numbered as part of that structure, and the read-a-line system call would allow you to specify by number which line you wanted to read.
Now try to implement stream-oriented file I/O like <iostreams> or even <stdio.h> over that... (In particular, consider read-a-char-seek-back-write-new-version as an operation, where you replace '\n' by something else.)
EDIT: Left something off:
(1) When I say DOS/Windows, I do not mean "Cygwin on Win32". That does (or did the last time I looked at it in that sort of detail) some things that I consider monumentally stupid, including showing me ALL the CRs in my Windows text files, as if its stdlib doesn't actually do anything different on Windows than it does on *NIX.
-
You're right, .Net's model won't support systems that don't support stream-oriented files (since it treats all files as stream-oriented binary files, and implements the (configurable) line-ending translation in `TextWriter`). And the way it "solves" the "seeking in text files" problem is simply by not supporting seeking at all in `TextReader`/`TextWriter`.
So I see it's not without its limitations, but it still makes more sense than C++ having only text-based streams with clumsy binary support (such as some locale stuff still affecting streams opened in binary mode; I do not remember any such gotchas linked to `fopen("wb")+fwrite()`).
-
You're right, .Net's model won't support systems that don't support stream-oriented files (since it treats all files as stream-oriented binary files, and implements the (configurable) line-ending translation in `TextWriter`). And the way it "solves" the "seeking in text files" problem is simply by not supporting seeking at all in `TextReader`/`TextWriter`.
So I see it's not without its limitations, but it still makes more sense than C++ having only text-based streams with clumsy binary support (such as some locale stuff still affecting streams opened in binary mode; I do not remember any such gotchas linked to `fopen("wb")+fwrite()`).
Even that bit of C nostalgia-creation would be problematic on MTS, because even binary files were line oriented... (And yes, that's every bit as bizarre as it sounds.)
-
Well, a text file isn't, because it contains line endings, and while they are consistent on any one platform(1), they aren't consistent between platforms.
Now I know for sure you're not quite certain what's going on, as you seem to be treating the stdio API as the fundamental one. Given that, in the rest of what you say, the truths are only there by accident, and you've got howlers mixed in that make the whole thing misleading.
Yes, I'm assuming that all systems map files as an undifferentiated bag of bytes. Because that's what they all do. Now.
-
-
Frankly, even in C++ you're better off using the <cstdio> streams when it comes to binary files.
I just operate under the assumption that `<iostream>` isn't even a thing. Works for me!
-
including showing me ALL the CRs in my Windows text files, as if its stdlib doesn't actually do anything different on Windows than it does on *NIX.
I wouldn't be surprised. Linux system calls and libc don't have a distinction between opening a file in text or binary mode. AFAIK, at least, but I would be very surprised if someone had a counterexample.
-
Well, a text file isn't, because it contains line endings, and while they are consistent on any one platform(1), they aren't consistent between platforms.
This is the kind of thing that makes naive ports of patch totally useless on Windows. :(
-
@Steve_The_Cynic said:
Well, a text file isn't, because it contains line endings, and while they are consistent on any one platform(1), they aren't consistent between platforms.
Now I know for sure you're not quite certain what's going on, as you seem to be treating the stdio API as the fundamental one. Given that, in the rest of what you say, the truths are only there by accident, and you've got howlers mixed in that make the whole thing misleading.
Yes, I'm assuming that all systems map files as an undifferentiated bag of bytes. Because that's what they all do. Now.
No, I'm not treating stdio as the fundamental behaviour, because, as I proceeded to explain, it isn't. It maps the local implementation of files (by the OS) onto an undifferentiated stream of bytes. However, that mapping can be hairy, to a greater or lesser extent, especially once you stray outside the world of DOS/Windows/UNIX/MacOS.
So, yes, all implementations of stdio do that mapping, but they have to do varying amounts of work, and sometimes they get it wrong, as in the behaviour I observed of Cygwin.
-
@Steve_The_Cynic said:
including showing me ALL the CRs in my Windows text files, as if its stdlib doesn't actually do anything different on Windows than it does on *NIX.
I wouldn't be surprised. Linux system calls and libc don't have a distinction between opening a file in text or binary mode. AFAIK, at least, but I would be very surprised if someone had a counterexample.
That wins a big fat "so fucking what" from me. Cygwin is a layer for running on Windows, not on Linux, and so it should obey the Windows text file conventions.
-
@Steve_The_Cynic said:
Well, a text file isn't, because it contains line endings, and while they are consistent on any one platform(1), they aren't consistent between platforms.
This is the kind of thing that makes naive ports of patch totally useless on Windows. :(
What, you mean ones that open text files in binary mode?
EDIT: Actually, there's a serious consideration: you can't patch LF-ish text files in place on Windows if you use a proper CRLF-ish stdio. Not that you should be patching literally in place anyway...
Oh, and Discodevs: please make the pink reply-consolidation advice box fuck off forever.
-
However, that mapping can be hairy
And stdio doesn't really do a good job of it in the first place. Its only virtue is its relative ubiquity.
-
I think the Cygwin devs did this for two reasons:
- Assuming most people working with Cygwin will work on UNIX-line-terminated files anyway (stuff from *n?x servers, FTP, etc.)
- Application compatibility, the same thing that attaches anchors to Microsoft. Since POSIX makes little to no difference between opening in binary or text mode, I bet a lot of *n?x programs (i.e., the kind of programs Cygwin is supposed to make work) get it wrong; therefore, Cygwin needed to be bug-compatible and mimic the *n?x behavior.
ETA: Remember, in Windows `open()` is not a system call. It's not even a Win32 API function. The system call is `NtCreateFile()`, the Win32 API function is `CreateFile()`; and `fopen()` and `_open()` are actually part of Microsoft Visual C++'s C Run-Time Library (alias MSVCRT.DLL in good ol' Visual 6), and so is the text/binary distinction in Windows: `CreateFile()` doesn't know, or care, what text mode is. It's purely a C concept, so it's implemented in the CRT. Cygwin probably uses `CreateFile()` directly, or always calls the CRT functions in binary mode.
Edit2: It's quite funny that the platform for which text/binary mode is an OS-level concept makes no difference between modes, while the platform for which the difference is relevant doesn't implement it at OS level.
-
even streams opened in binary mode deal in characters instead of bytes
Please educate me, as for all I know a C/C++ `char` is nothing more than a byte.
-
The difference is more apparent in wide streams (streams of `wchar_t`). It's all `basic_stream<T>::char_type`, including the simple `put` method; there's simply nothing about non-character/non-string data in the interface.
-
Then my question becomes: why do you use `wchar_t`, or streams thereof, for binary files? For me it seems like a data type specifically designed for textual data. (Sorry if it's really an annoying or noob question, I only have superficial knowledge of C++.)
-
Please educate me as for all I know a C/C++ char is nothing more than a byte.
Ok:
3.9.1.1
Objects declared as characters (char) shall be large enough to store any member of the implementation’s basic
character set. If a character from this set is stored in a character object, the integral value of that character
object is equal to the value of the single character literal form of that character. It is implementation-defined
whether a char object can hold negative values. Characters can be explicitly declared unsigned or signed.
Plain char, signed char, and unsigned char are three distinct types, collectively called narrow character
types. A char, a signed char, and an unsigned char occupy the same amount of storage and have the same
alignment requirements (3.11); that is, they have the same object representation. For narrow character types,
all bits of the object representation participate in the value representation. For unsigned narrow character
types, all possible bit patterns of the value representation represent numbers. These requirements do not
hold for other types. In any particular implementation, a plain char object can take on either the same
values as a signed char or an unsigned char; which one is implementation-defined. For each value i of
type unsigned char in the range 0 to 255 inclusive, there exists a value j of type char such that the result
of an integral conversion (4.7) from i to char is j, and the result of an integral conversion from j to unsigned
char is i.
-
That, and the sizes of all other types are defined as multiples of the char size. Also, char is excused from the strict aliasing rule. Moreover, UTF-8 is inherently incompatible with the above definition, but no one gives a fuck.
-
The nice thing about UTF-8 in an ASCII-oriented environment like C is that everything keeps working. Only truncation and character-oriented operations on non-ASCII characters are a problem.
Notably, text parsing and tokenizing operations are typically safe as long as your syntax characters are in ASCII.
-
The nice thing about UTF-8 in an ASCII-oriented environment like C is that everything keeps working
Oh yeah, especially strlen().
-