[Wasn't sure whether I should post this here, but it's kinda WTF-ey, so eh, what the hell.]
So, I'm a diehard fan of the Unity game engine. Absolutely love it.
But sometimes the Unity team just releases the most bizarre shit.
Recently, they released 5.1 Release Candidate 1. This release candidate includes their new UNET networking API, which design-wise looks to be a massive improvement over the pile of crap that was the old Unity Networking.
I develop a Unity plugin called DFVoice, which implements a framework for multiplayer voice chat. So I decided to sit down and put together an example of how to use DFVoice with UNET.
About ten minutes later I go to test. The sound is jittery as fuck. "OK," I think to myself, "must have done something wrong. Let's see what it's receiving..."
All packets in DFVoice have an index attached to them. So I log the sent and received message indices.
I notice something peculiar with the received messages.
Received: 1 Received: 0 Received: 5 Received: 4 Received: 3 Received: 2 Received: 10 Received: 9 [...]
Eh, wha? That's not right. No wonder it sounded jittery - DFVoice throws out any packet that arrives out of order (basically, unreliable-sequenced delivery) and also informs the codec of the lost packets (codecs like Speex and Opus do some sort of packet-loss-concealment black magic for the missing data).
So it's throwing away 90% of the voice packets.
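For the curious, the drop logic is roughly this. Simplified sketch only - not the actual DFVoice source, and the names here (`SequencedReceiver`, `Accept`, `OnPacketsLost`) are made up for illustration:

```csharp
// Sketch of unreliable-sequenced filtering: accept only packets
// whose index is strictly greater than the last one we decoded.
public class SequencedReceiver
{
    private int lastIndex = -1;

    // Returns true if the packet should be decoded,
    // false if it arrived late / out of order and must be dropped.
    public bool Accept( int index )
    {
        if( index <= lastIndex )
            return false; // stale or out-of-order packet

        // A gap means packets were lost in transit - tell the codec
        // so it can run loss concealment for the missing frames.
        int lost = index - lastIndex - 1;
        if( lost > 0 )
            OnPacketsLost( lost );

        lastIndex = index;
        return true;
    }

    private void OnPacketsLost( int count ) { /* hypothetical codec PLC hook */ }
}
```

Feed that the sequence above (1, 0, 5, 4, 3, 2, 10, 9, ...) and it only accepts 1, 5, and 10 - which is exactly the ~90% packet loss I was hearing.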
After about half an hour of pulling my hair out, I finally tried this:
```csharp
using UnityEngine;
using UnityEngine.Networking;

// Class name is arbitrary; RPCs have to live on a NetworkBehaviour.
public class RpcOrderTest : NetworkBehaviour
{
    public void Start()
    {
        if( NetworkServer.active )
        {
            for( int i = 0; i < 20; i++ )
            {
                // Unity rewrites the IL so that this sends an RPC
                // instead of directly calling the function
                RpcTest( i );
            }
        }
    }

    [ClientRpc( channel = Channels.DefaultReliable )]
    public void RpcTest( int index )
    {
        Debug.Log( index );
    }
}
```
Lo and behold.
19 18 17 16 15 ... 0
So, the messages are buffered and then dispatched... in reverse order, LIFO instead of FIFO.
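My guess - pure speculation, I haven't seen the UNET internals - is that somewhere a LIFO structure is standing in for a FIFO one. The difference is trivial to demonstrate in plain C#:

```csharp
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        // Buffer the same 5 messages both ways.
        var queue = new Queue<int>(); // FIFO: what dispatch should use
        var stack = new Stack<int>(); // LIFO: what UNET appears to be doing

        for( int i = 0; i < 5; i++ )
        {
            queue.Enqueue( i );
            stack.Push( i );
        }

        while( queue.Count > 0 )
            Console.Write( queue.Dequeue() + " " ); // 0 1 2 3 4
        Console.WriteLine();

        while( stack.Count > 0 )
            Console.Write( stack.Pop() + " " );     // 4 3 2 1 0
        Console.WriteLine();
    }
}
```

Swap a `Queue` for a `Stack` in a dispatch loop and you get exactly the reversed output I logged.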
I just... how do you mess this up? And let it get to Release Candidate?
Am I missing something? Is this an easy mistake to make or something?