Rewriting Java classfiles
-
https://github.com/Bukkit/CraftBukkit/commit/e6cd8c017fe6991600c6af1955ace9d517a7080f
this.playerView = Collections.unmodifiableList(Lists.transform(playerList.players,
        new Function<EntityPlayer, CraftPlayer>() {
            @Override
            public CraftPlayer apply(EntityPlayer player) {
                return player.getBukkitEntity();
            }
        }));
Of note is the maps.yml file. This contains a series of instructions to OverMapper for how to rewrite the classfiles.
https://github.com/Bukkit/CraftBukkit/blob/e6cd8c017fe6991600c6af1955ace9d517a7080f/maps.yml#L20
This tool was originally created because the int return types of all those methods needed to change to doubles, but "we want to do this in a way so that existing plugins still work", which resulted in the creation of OverMapper, which reads that maps.yml file.
Meh... now that I write all this out, it really feels like not so much of a WTF.
Still, it's pretty amazing that they managed to break source code compatibility but not binary compatibility.
-
Still, it's pretty amazing that they managed to break source code compatibility but not binary compatibility.
That probably does qualify as a WTF, but it's one I'd be proud of having done actually.
-
Meh... now that I write all this out, it really feels like not so much of a WTF.
Rewriting object code to maintain binary compatibility instead of renaming all existing methods that needed a different return type and adding the original methods as stubs? I would call that a WTF.
-
and adding the original methods as stubs
They are stubs that forward to the new method.
/**
 * This method exists for legacy reasons to provide backwards
 * compatibility. It will not exist at runtime and should not be used
 * under any circumstances.
 */
@Deprecated
public void _INVALID_setHealth(int health) {
    setHealth(health);
}
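A minimal, self-contained sketch of that forwarding-stub pattern (the class name Damageable here is made up for illustration; Bukkit's actual interface is different):

```java
// Sketch of the forwarding-stub pattern quoted above. The legacy
// int-typed method simply delegates to the new double-typed one.
public class Damageable {
    private double health;

    public void setHealth(double health) { this.health = health; }
    public double getHealth() { return health; }

    /** Legacy int-typed entry point that forwards to the new method. */
    @Deprecated
    public void _INVALID_setHealth(int health) { setHealth(health); }

    public static void main(String[] args) {
        Damageable d = new Damageable();
        d._INVALID_setHealth(20);          // old-style call still works
        System.out.println(d.getHealth()); // prints 20.0
    }
}
```

At rewrite time the stub's name and descriptor can then be mapped back onto the original, so old binaries never see the `_INVALID_` name.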
-
They are stubs that forward to the new method.
Upon better inspection I see what they're doing, and that is actually quite clever. If I see it correctly, they're creating a method overloaded on return type. That isn't possible in Java, but it is possible in the bytecode, because the return type is part of the method signature there, so they're rewriting the bytecode?
You could've made it a bit more clear.
Still, I think it would've been better to just use differently named methods.
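The bytecode-level distinction is visible in JVM method descriptors, which encode the return type. A small stdlib-only sketch:

```java
import java.lang.invoke.MethodType;

public class DescriptorDemo {
    public static void main(String[] args) {
        // At the JVM level a method is identified by name *and* descriptor,
        // and the descriptor includes the return type. So setHealth(I)I and
        // setHealth(I)D are distinct methods in a classfile, even though
        // the Java language would reject them as a duplicate declaration.
        MethodType intVersion = MethodType.methodType(int.class, int.class);
        MethodType doubleVersion = MethodType.methodType(double.class, int.class);
        System.out.println(intVersion.toMethodDescriptorString());    // (I)I
        System.out.println(doubleVersion.toMethodDescriptorString()); // (I)D
    }
}
```

This is why a classfile rewriter can host both "overloads" side by side while Java source cannot.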
-
player.realSetHealth(0);
-
-
How is that a PHP-style fubar?
-
I'm guessing mysql_escape_string vs mysql_real_escape_string
-
How is that a PHP-style fubar?
I'm guessing mysql_escape_string vs mysql_real_escape_string
Yeah, sorry, I think we already beat that one into the ground, but it was just too tempting.
-
If it is used for the setters it really is stupid imho. Why not just use a regular overload?
@Deprecated
public void setHealth(int health) {
    setHealth((double) health);
}

public void setHealth(double health) {
    // ...
}
With getters it is an inconvenience. That's the cost of changing an API while retaining binary compatibility.
You basically have the following (standard) choices:

1. Create new methods
2. Create a new class, possibly making the old class a facade (this is why Java has Vector and ArrayList, or Hashtable and HashMap)
3. Break binary compatibility, requiring all code that depends on it to be recompiled
I can understand why 3 was unfeasible.
The option that was chosen is imho worse than 1 and 2. You get the advantage of maintaining binary compatibility while breaking source compatibility, but you:

- Complicate your build process
- Open yourself up to weird and hard-to-detect bugs through your object-code rewriter
- Increase your chance of encountering JVM bugs, since the situation you create will be very rare
- Actively discourage plugin authors from creating a new plugin version, as they will have to change (possibly lots of) their code if they want to recompile
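For comparison, the facade option mentioned above can be sketched like this (the class names Health and LegacyHealth are hypothetical, chosen only to illustrate the pattern):

```java
public class FacadeDemo {
    // New API class with the corrected double-typed health.
    static class Health {
        private double value;
        double get() { return value; }
        void set(double v) { value = v; }
    }

    // Old API kept as a thin facade over the new class, so existing
    // source and binaries both keep working unchanged.
    @Deprecated
    static class LegacyHealth {
        private final Health delegate = new Health();
        int getHealth() { return (int) delegate.get(); }
        void setHealth(int h) { delegate.set(h); }
    }

    public static void main(String[] args) {
        LegacyHealth old = new LegacyHealth();
        old.setHealth(20);                   // old callers keep compiling
        System.out.println(old.getHealth()); // prints 20
    }
}
```

This is the Vector/ArrayList route: the old type stays forever, but no build-time bytecode rewriting is needed.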
-
With getters it is an inconvenience. That's the cost of changing an API while retaining binary compatibility.
[...]
The option that was chosen is imho a worse option than 1 and 2.
The problem is that the JVM allows methods to differ by just return type (it's part of the spec, and always has been) but the Java language doesn't.
@dtech said:
You get the advantage of maintaining binary compatibility while breaking source compatibility, but you:

- Complicate your build process
- Open yourself up to weird and hard-to-detect bugs through your object-code rewriter
- Increase your chance of encountering JVM bugs, since the situation you create will be very rare
- Actively discourage plugin authors from creating a new plugin version, as they will have to change (possibly lots of) their code if they want to recompile
Complex build processes are relatively common with Java systems; mavenized builds are good at recording the complexity and reproducing it exactly.

The second and third points are fair.
The fourth one is silly. Either you're supporting older code — a good objective — or you're encouraging third parties to keep up to date — a laudable objective that can be very irritating to downstream developers. As a developer, having the rug pulled out from under your feet is a good reason to switch to a different system entirely.
-
if I see it correctly they're creating a method overloaded on return type, but since that isn't possible in java, but possible in the bytecode
The problem is that the JVM allows methods to differ by just return type (it's part of the spec, and always has been)
I already said that.
The fourth one is silly. Either you're supporting older code — a good objective — or you're encouraging third parties to keep up to date — a laudable objective that can be very irritating to downstream developers. As a developer, having the rug pulled out from under your feet is a good reason to switch to a different system entirely.
I don't understand your point. The (old) plugins won't compile against the new code, even though precompiled binaries will continue to work. As such it provides less backwards compatibility than a facade or new methods.
Just changing the signatures also breaks source compatibility, but in my 4th point I argue that this solution is actually worse. Now old binaries will continue to work while changes need to be made to make new releases/binaries. Thus there is a negative incentive to create new releases.
-
Just changing the signatures also breaks source compatibility, but in my 4th point I argue that this solution is actually worse. Now old binaries will continue to work while changes need to be made to make new releases/binaries. Thus there is a negative incentive to create new releases.
You're entitled to your opinion. My point was that though from your perspective — upstream — it is reasonable to force downstream to upgrade to suit your release cycles, from the perspective of the clients of your code — downstream — it's damn annoying to have to update your code just because someone screwed up their initial types. I've seen people switch libraries/products for less.