Many of the netcurl 6.1 components have been released this week: dependencies that “we” use to make netcurl more efficient. Many things have been imported from 6.0, but instead of copying local code from the old project (with pride), all code has been rewritten from scratch. Having old code in a new project could probably be devastating.
So. The wheel has been built again?
No. Yes. Maybe. Nah. We’re just making the wheel better!
This “wheel” is a self-contained project. As usual. I’ve said it before. The project is being built to be a simplified component between an idiot developer and the internet communication layers. For example, when building new solutions, a developer normally needs to reinitialize the tools he/she (hen) is using. If the developer is using curl, it has to be configured from scratch every time. And there are probably other solutions that do similar stuff to this module. But I don’t want them.
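To make that concrete, this is roughly the boilerplate a plain curl request in PHP tends to require before it does anything useful (a generic sketch, not netcurl code):

```php
<?php
// Plain curl: every request starts with the same setup ritual.
$ch = curl_init('https://example.com/api');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);

$body = curl_exec($ch);
if ($body === false) {
    throw new RuntimeException(curl_error($ch), curl_errno($ch));
}
curl_close($ch);
```

That setup is what the wrapper is supposed to take off the developer’s hands.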
The primary reason for this is the fact that I need more. I need a parser that handles all communication by itself, without my interference. I say “bring me this website, parsed”, and the solution should bring it to me, regardless of whether it is SOAP, REST, XML, RSS or a socket connection. Netcurl is the part that should decide this for me, and I should be ready on the other side to receive it. And this is actually how curl itself works.
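As a toy illustration of that “decide for me” idea (this is not how netcurl itself is implemented, just a sketch of the principle): look at what came back and pick the parser accordingly.

```php
<?php
// Toy sketch: fetch a URL and parse the body based on its content type.
function fetchParsed(string $url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    $type = (string) curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
    curl_close($ch);

    if (strpos($type, 'json') !== false) {
        return json_decode($body);
    }
    if (strpos($type, 'xml') !== false || strpos($type, 'rss') !== false) {
        return simplexml_load_string($body);
    }

    return $body; // Unknown type: hand back the raw body.
}

$parsed = fetchParsed('https://example.com/feed');
```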
Unlike the military, this application is not supposed to ask “HOW HIGH?!” when someone says “Jump!”. It should know how high to jump before the developer has even measured that value.
However, if the site speaks SOAP, you need an extra component, and that component is usually SoapClient. Netcurl should handle this out of the box. Or XML. Or RSS. So here we are again.
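For reference, this is what a hand-rolled SOAP call with PHP’s native SoapClient typically looks like; the WSDL URL and operation name below are made up for the example, and this extra step is exactly what netcurl is meant to hide:

```php
<?php
// Manual SOAP: you need to know up front that the endpoint speaks SOAP.
$client = new SoapClient('https://example.com/service?wsdl', [
    'exceptions'         => true,
    'connection_timeout' => 10,
]);

// Call an operation defined by the WSDL (operation name is just an example).
$result = $client->__soapCall('getStatus', [['orderId' => 12345]]);
```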
netcurl 6.1 is being rebuilt, and you’d probably understand why if you look at the old codebase. It was supposed to transform into a PSR-4 solution. But it failed. So there are a lot of reasons why this wheel is being rebuilt. And this time, I think I got it faster. For example: netcurl 6.0 (or TorneLIB 6.0) did not support anything but regular curl communications. If you wanted to fetch five different pages, you had to run the curl request five times. netcurl 6.1 takes advantage of curl_multi calls, so you can push five different URLs into the same session and get them back from their own handles. Like it should be.
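A minimal sketch of that idea in plain PHP, using the curl_multi functions directly rather than netcurl’s own wrapper: all URLs go into one multi session, and each response comes back from its own handle.

```php
<?php
// Fetch several URLs through one curl_multi session instead of
// running sequential curl_exec() calls.
$urls = [
    'https://example.com/a',
    'https://example.com/b',
    'https://example.com/c',
];

$multi   = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($multi, $ch);
    $handles[$url] = $ch;
}

// Run all transfers until every handle is finished.
do {
    $status = curl_multi_exec($multi, $active);
    if ($active) {
        curl_multi_select($multi);
    }
} while ($active && $status === CURLM_OK);

// Collect each response from its own handle.
$responses = [];
foreach ($handles as $url => $ch) {
    $responses[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);
```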
So what does back to basics mean?
Well. netcurl, or TorneLIB as it was previously named, was actually built with war in mind. It was supposed to scrape proxy lists, collect the proxies and then register them for use in a DNSBL. In short: blacklisting trolling haters on the internet. Then work came my way, and I found out that netcurl had a perfect role in an e-commerce project we built (yes, we). As it supported both REST and SOAP in an environment where the library itself chose an available driver, it could handle much more than just warfare applications.
Time passed, and a few weeks ago corona said hello. At that point most of us stayed home, so when the “real work” ended I could take care of netcurl without interruptions. And here we are, almost done. And this time, the library should be able to handle things better than it used to.