Creating a CLI in Python: It's just a Click away!

Hey everyone!

I've finally managed to set aside some time to work on my Release 0.1, and, crazy me, also decided to make it my first Python project. I've dipped my toes into these waters before, but a string of half-finished tutorials has really only gotten me used to the syntax, which is also looser than what I'm used to from my work in C++. So this has turned into quite the change for me.

After doing a bit of research on how to approach the project, I found an article to use as my starting point, which explained why to choose the "Click" framework over "argparse" and also provided a useful example to get the ball rolling: Writing Python Command-Line Tools With Click

From here, I threw together a quick main() to accept an argument and, with the help of the W3Schools Python RegEx reference page, made a basic checker that tests whether a list of links is populated after matching the input against the Python regex for web URLs I found here.

My simple checker ended up looking like this:
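(The original snippet isn't reproduced here, so this is a minimal sketch of what that checker looked like; the regex and the `target` argument name are my reconstruction, not the exact code.)

```python
import re

import click

# An approximation of the web-URL regex I found, not the exact pattern
URL_PATTERN = re.compile(r"https?://[a-zA-Z0-9$\-_@.&+!*(),/#?=~%]+")

@click.command()
@click.argument("target")
def main(target):
    """Check TARGET for web URLs and report what was found."""
    links = URL_PATTERN.findall(target)
    if links:
        click.echo(f"Found {len(links)} link(s): {links}")
    else:
        click.echo("No links found.")
```

With an `if __name__ == "__main__": main()` guard at the bottom, this runs as `python checker.py https://example.com`.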

At this point, I was able to pass a link to the function and check it against my regular expression, but I hadn't set up the file parser to dig into the link itself.

From here, I had to decide how to proceed. Would I try to dive into anything that was passed to the function, or would I first determine what was being passed and tackle it from there? I ended up going with a hybrid: use the regular expression to see whether the argument is a file or an actual link, and then call the appropriate function. One function reads the file for any links, and the other reads the web page for any additional links found on it (after determining that the link itself is valid).
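That dispatch logic can be sketched roughly like this; the function names and the simplified regex are my own placeholders, not necessarily what ended up in the project:

```python
import re

URL_PATTERN = re.compile(r"https?://\S+")

def links_from_file(path):
    """Read a local file and collect any links found inside it."""
    with open(path, encoding="utf-8") as f:
        return URL_PATTERN.findall(f.read())

def links_from_page(url):
    """Placeholder: a real version would fetch the page and scan its HTML."""
    return []

def dispatch(target):
    # If the whole argument matches the URL pattern, treat it as a web page;
    # otherwise assume it's a path to a local file.
    if URL_PATTERN.fullmatch(target):
        return links_from_page(target)
    return links_from_file(target)
```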

To begin, I wrote a function to check links, since it will be needed in both parts of the program. To accomplish this, the requests library was needed to grab the status code of the site in question without opening the link in the process. I then checked the result against a basic 200 and 404 test, which so far is enough to divide links into three categories.
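A sketch of that check, assuming a HEAD request is used to fetch only the status line and headers (the function name and category labels are placeholders):

```python
import requests

def check_link(url):
    """Classify a link as GOOD, BAD, or UNKNOWN from its HTTP status code."""
    try:
        # HEAD fetches the status code without downloading the page body
        status = requests.head(url, timeout=5).status_code
    except requests.RequestException:
        # Connection errors, timeouts, bad URLs, etc.
        return "UNKNOWN"
    if status == 200:
        return "GOOD"
    if status == 404:
        return "BAD"
    return "UNKNOWN"
```

Anything that isn't a clean 200 or a 404 (redirects, server errors, timeouts) falls into the third bucket for now.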

That's as far as I got for now, so I'll continue this progress soon! In the meantime, it's time to contemplate my file-parsing strategy.
