While working at REDACTED, I reported the following issue.
The Back end team routinely changes the services they offer to the Front end team. Changes occur both at the service level (some services are added, others removed) and at the data level (some attributes are added, others removed). This is inevitable during development, but it doesn’t need to cause random bugs.
Currently, those changes get lost because the Swagger pages (Swagger being the tool used at REDACTED to publish HTTP endpoints) are published anew whenever the Back end team wants, which can be many times a day, and the tool does not show any differences with respect to a previous version.
Soon after detecting this problem, I wrote a small script that accesses the Swagger pages and generates the corresponding data interfaces (many thousands of lines of attributes with their names and types, readily usable from TypeScript). This allowed me to easily compare snapshots of the Back end services taken at two different times and find out whether, where, and what changes had occurred in between.
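The script itself is not worth publishing here, but the core idea fits in a sketch. The following TypeScript is not the actual script: the URL and the assumption that the Back end publishes a Swagger 2.0 document with a `definitions` section are mine, for illustration only. It fetches the JSON behind a Swagger page and prints one interface per definition; redirecting the output to a dated file gives a snapshot that can be diffed against any other.

```typescript
// Minimal sketch (illustrative, not the real script).
// Assumes a Swagger 2.0 JSON document reachable at a URL such as
// http://backend.example/v2/api-docs (hypothetical).

type SwaggerProperty = { type?: string; $ref?: string };
type SwaggerDefinition = { properties?: Record<string, SwaggerProperty> };
type SwaggerDoc = { definitions?: Record<string, SwaggerDefinition> };

// Map a Swagger property to a TypeScript type name.
// Arrays, enums, etc. are left as `unknown` to keep the sketch short.
function tsType(prop: SwaggerProperty): string {
  if (prop.$ref) return prop.$ref.replace("#/definitions/", "");
  switch (prop.type) {
    case "integer":
    case "number": return "number";
    case "boolean": return "boolean";
    case "string": return "string";
    default: return "unknown";
  }
}

async function generateInterfaces(url: string): Promise<string> {
  const doc = (await (await fetch(url)).json()) as SwaggerDoc;
  const lines: string[] = [];
  for (const [name, def] of Object.entries(doc.definitions ?? {})) {
    lines.push(`export interface ${name} {`);
    for (const [prop, spec] of Object.entries(def.properties ?? {})) {
      lines.push(`  ${prop}: ${tsType(spec)};`);
    }
    lines.push(`}`, "");
  }
  return lines.join("\n");
}

// Print the generated interfaces; piping this to a dated file produces
// a snapshot that a plain text diff can compare with a previous one.
generateInterfaces("http://backend.example/v2/api-docs")
  .then(ts => console.log(ts));
```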
But that was just a partial solution to a process problem. In fact, not only are those changes still lost today (my script is not being used), but the Front end side still interfaces with the Back end side through a collection of classes translated from the Swagger pages into TypeScript, and for no good reason (cheap labor?) this translation is done manually, even though it is a purely mechanical operation.
Notice that additional improvements would be just around the corner once this process is automated. For example, if a Java program generated the TypeScript services at each new version, not only would the changes be easy to spot (with a simple text comparison tool), but we could also push the whole set of changes to a repository as a merge request, giving the Front end team time to evaluate their impact and decide how to deal with them.
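To make the benefit concrete, here is a made-up excerpt of what two consecutive snapshots of the generated code could look like. The DTO name and attributes are invented, but a change of this kind is exactly what a plain text diff, reviewed as a merge request, would surface before anything breaks at runtime.

```typescript
// Hypothetical excerpt: the previously generated snapshot contained
//
//   export interface CustomerDto {
//     id: number;
//     name: string;
//     email: string;
//   }
//
// while the newly generated snapshot contains:
export interface CustomerDto {
  id: number;
  name: string;
  phone: string; // `email` is gone, `phone` appeared: a two-line text diff
}
```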