When you write source code, you access the library through an API. Once the code is compiled, your application accesses the binary data in the library through the ABI. The ABI defines the structures and methods that your compiled application will use to access the external library (just like the API did), only on a lower level.
An oversimplified summary:
API: "Here are all the functions you may call."
ABI: "This is how to call a function."
Your API defines the order in which you pass arguments to a function. Your ABI defines the mechanics of how those arguments are passed (registers, stack, etc.).
Linux and Windows use different ABIs, so a Windows program won't know how to access a library compiled for Linux.
If the ABI changes but the API does not, then the old and new library versions are sometimes called "source compatible": existing code still compiles against the new version, but already-compiled binaries must be rebuilt.
Keeping an ABI stable means not changing function interfaces (return type and the number, types, and order of arguments), definitions of data types or data structures, defined constants, etc. New functions and data types can be added, but existing ones must stay the same. If, for instance, your library uses 32-bit integers to indicate the offset of a function and you switch to 64-bit integers, then already-compiled code that uses that library will not be accessing that field (or any following it) correctly.
Also, if you have a 64-bit OS which can execute 32-bit binaries, you will have different ABIs for 32- and 64-bit code.