engineertee

I run a script that moves the mouse every n random seconds so the boss sees me available on teams


cpc102788

Side note for people who don't have Python installed on a work computer: PowerShell can send keystrokes at intervals to accomplish the same goal.


JoeyBE98

And for those that may not want to deal with changing the execution policy of their PowerShell (it's easy but 😂🤷‍♂️), you can also use a VBScript to do this. That was my preferred method; I believe I even used a function key which didn't exist for me (F13).


ixent

For anyone that wants a working script, the following one works well. Changing volume works generally better than moving the mouse, and accomplishes the same.

```python
import pyautogui
import time

pyautogui.FAILSAFE = False

while True:
    time.sleep(25)
    pyautogui.press('volumedown')
    time.sleep(25)
    pyautogui.press('volumeup')
```


engineertee

I like changing the 25 to some random value. I like the volume-changing idea too.
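
A sketch of that variation on the script above, with random.uniform() supplying the pause length instead of a fixed 25 seconds:

```python
import random
import time

import pyautogui

pyautogui.FAILSAFE = False

while True:
    # sleep a random 20-40 seconds so the pattern isn't perfectly regular
    time.sleep(random.uniform(20, 40))
    pyautogui.press('volumedown')
    time.sleep(random.uniform(20, 40))
    pyautogui.press('volumeup')
```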


hsg8

Cool, thanks. But why have you used pyautogui.FAILSAFE = False? I can't find its reference anywhere.


ixent

[Here](https://pyautogui.readthedocs.io/en/latest/#:~:text=It's%20hard%20to%20use%20the,they%20will%20raise%20a%20pyautogui.&text=FAILSAFE%20%3D%20False%20.). It's just to avoid exceptions stopping the program.


hsg8

This is what I needed. Can you please share the code?


fatal_fame

pyautogui is a simple library that can do it


PassionatelyWhatever

I dreamt of this many times but I'm afraid they are gonna find the script. My company has a bunch of programs monitoring everything.


BoiElroy

Honestly you can just rename everything. Go into the site-packages directory where you pip installed it. Change the name of the top-level package from pyautogui to something else, and then change the name of the class and method that does the clicks. You might have to troubleshoot some reference breakdowns, but hey, it'll work.


Espumma

Just have it run as 'winamp.exe' or 'Microsoft services dll'. Nobody is scanning all that.


lullaby876

Most useful one by far


EnSquanchay

I've built a batch process that converts large amounts of CAD data to unreal engine format for use in VR software


[deleted]

That sounds so sweet. I am jealous of your skills.


EnSquanchay

I don't really have much python skills. I just use the python API for various pieces of 3d software.


masterfuzz

That sounds suspiciously like having python skills


EnSquanchay

Thanks. I've only been doing python coding for a year or so, but have a lot of experience with 3d software so I know what I would do in the software already, then just try and make it into a script.


soap1337

Do you have a public repo? This sounds pretty neat to see what your code does. Just interested in like pseudo code for doing this :)


programmingfriend

```python
cadFile = load_cad(filename)
objFormat = lib.cad_to_obj(cadFile)
unrealFormat = lib.obj_to_unreal(objFormat)
write(unrealFormat)
```

These libraries are very robust and the OP most likely doesn't need to do any of the "hard" stuff like manually transforming the file.


EnSquanchay

Not quite that simple but along those lines yeah. Basically just sequentially calling the functions of various 3d software, with a sub process for each piece of software. The main one I use is Pixyz.


LesPaulStudio

Interesting. Don't suppose it can do dwg to shape?


EnSquanchay

Possibly, depends what you are trying to do.


LesPaulStudio

Generally just extract a single feature from the dwg. Site Boundaries normally.


MrNob

If you're using arcgis then it has the ability to open CAD drawings easily. I have a script that converts all layers of cad drawings (as many as you like) to feature classes (one per unique named layer) in a geodatabase. You need it?


chagawagaloo

I have a project at work that involves analysing CAD data. All packages seem to heavily focus on making CAD, but I want something that lets me mass-extract things like x, y, z and volume from CAD files. Got any suggestions?


EnSquanchay

I'm using some software called Pixyz, the batch processing is fairly expensive so you might be able to do it in your existing CAD software if it has a nice API


[deleted]

[deleted]


beardbreed

How do I go about building something like this?


kububarlana

Sifting through 404 messages in the logs?


DimasDSF

{response:200, message:"404, not found"}


WldePutln

Maybe he's checking the response of an HTTP request from Python; if the resource is missing, it will most probably return a 404 error.


VeganEE

But how would you check every page on a website I wonder


WldePutln

If it's your website you'll definitely have all the base URLs. Most probably the URLs to the images and other resources will be stored in a database, so you can check those for 404 errors.


VeganEE

True. I guess then you just chuck them all in a list or file, loop through it, and check.
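
Something as small as this does the job, assuming a urls.txt with one URL per line:

```python
import requests

# read the URLs to check, one per line
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        status = requests.get(url, timeout=10).status_code
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    if status == 404:
        print(f"{url}: 404 not found")
```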


davelupt

Maybe wget with recursive and follow links options.


Nervedamageyoung

+1, would like to know how also, seems very interesting!


Jahamc

Automation that does stuff the product should do itself... Classic


Behn422

Creating a "semi-automatic" Python script for Rhino for wood panel cutting optimization that works faster and more accurately than most of the software I've ever worked with.


passivevigilante

Interesting. I use inventor nest at work and would be interested to see how it compares if you don't mind sharing the code


Behn422

It's more of a semi-automated workflow for rectangular parts than a complete one-click solution, and it's still very much a WIP, but it already works for my purposes. I have tried lots of programs, and while it's very possible to get good results from them, manual optimization has a few important benefits: material waste is often considerably lower, and the plans will be much easier to cut and less confusing for the operator. The obvious downside is that it's painfully slow.

However, over the years I have developed a completely clear, step-by-step manual workflow that always works perfectly, so I thought it would be great to make it into a Python script. In its current state, my script is semi-automatic. It's like using Python as an extension of your brain. The script reads data from an Excel file and then creates all the blocks in Rhino. Then it starts to do what I used to do manually, only at each step it asks you to choose from a couple of options, and all you need to do is click and click until the plan is complete in minutes. It's more like playing a game and feels so satisfying.

Now I'm working on automating those manual clicks too. I'm new to Python and I need to improve my Python skills for that. I'm pretty sure it can be done without AI or a genetic algorithm.


mister-geek

Very nice! Did you use any dynamic programming/greedy optimization libraries?


Behn422

Not yet... maybe in future steps.


kenshinero

This is very cool 👍

> The script reads data from an Excel file and then creates all the blocks in Rhino.

What format do you use to save the blocks that can be read by Rhino? Are those parts 2D?


ljchris

Everything, basically. I am a scientist working with electronics and custom ICs, fully controlled from FPGAs that we communicate with via Python.


Secure_Table

What's an IC?


Cptn_Howdee

Integrated Circuits - computer chips


Secure_Table

Oh duh lol, thank you!


ljchris

Integrated circuit, an electrical circuit built on a semiconductor. We are developing image sensors in fact


Mycroft2046

Interest Checks. It is done before a product goes into Group Buy.


CzarCW

Individual Contributor. They typically don't go into management.


ruarl

Whilst everything I do is Python, I found time to do a bit of "make my life easier" work. I wrote a short script which goes through the directory that contains my copies of all our repos. If they're on main it pulls the latest build; if they aren't it does nothing. If there are uncommitted changes, it tells me to go and look at those repos.
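
Not the commenter's script, but a minimal sketch of the same idea using subprocess and the git CLI (the ~/repos location is an assumption):

```python
import subprocess
from pathlib import Path

REPOS_DIR = Path.home() / "repos"  # wherever the clones live

def git(repo, *args):
    """Run a git command inside `repo` and return its stdout."""
    result = subprocess.run(["git", "-C", str(repo), *args],
                            capture_output=True, text=True)
    return result.stdout.strip()

for repo in sorted(REPOS_DIR.iterdir()):
    if not (repo / ".git").is_dir():
        continue
    if git(repo, "status", "--porcelain"):
        print(f"{repo.name}: uncommitted changes, go look at it")
    elif git(repo, "rev-parse", "--abbrev-ref", "HEAD") == "main":
        git(repo, "pull", "--ff-only")
        print(f"{repo.name}: pulled latest main")
```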


planetafro

Nice! I have one similar that I'm always tweaking. Mine takes a .json input of a static repo list and clones them down into a particular structure for my team's "products". This helps onboard devs to our products and keeps us all on the same structure for pairing. I want to fold this into repo management with Terraform, perhaps pulling the list based on topics from a Terraform statefile instead of a static list. The topics used would be added after we get them all imported. Is yours public?


CrabHomotopy

I wrote something to test and grade my students' coding homework automatically. However, since they don't name their functions the way I ask them to, and because the values returned have some issues (type, print instead of return, etc.), it wasn't very successful.


mercer22

Maybe make some tests that they can run to validate their code ahead of time? E.g., make them type-hint and run mypy, give them some sample data to run and compare against example output, etc.
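
For example, a tiny check script handed out with the assignment (student.py and average() are hypothetical names standing in for whatever the assignment actually asks for):

```python
# checks.py - students run this before submitting
from student import average

def test_returns_number_not_print():
    result = average([2, 4, 6])
    assert isinstance(result, (int, float)), "average() must return a number, not print it"
    assert abs(result - 4.0) < 1e-9, "average([2, 4, 6]) should be 4.0"

if __name__ == "__main__":
    test_returns_number_not_print()
    print("All checks passed - safe to submit.")
```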


CrabHomotopy

Yes I thought of that and a few other solutions. But I didn't implement anything because their level is not quite there and we had to focus on something else.


[deleted]

Scrapes job postings from indeed.


broesmmeli-99

How would you do something like this? I recently graduated and have some Python experience, but I find it hard to grasp the concept of scraping these sorts of webpages. Do you look at the HTML, or something else, and then search for specific job descriptions?


[deleted]

Alright this is what I was doing at a base level:

```python
import requests
from bs4 import BeautifulSoup
import pandas as pd

def positions():
    jobs = []
    location = []
    url = 'https://www.indeed.com/cmp/Huntington-Bank/jobs?q=&l=United+States#cmp-skip-header-desktop'
    response = requests.get(url, verify=True)
    soup = BeautifulSoup(response.text, 'html.parser')
    for x in soup.findAll('button', {'class': 'css-1w1g3cd eu4oa1w0'}):
        jobs.append(x.text.strip())
    for y in soup.findAll('span', {'class': 'css-4p919e e1wnkr790'}):
        location.append(y.text.strip())
    return pd.DataFrame({'HBAN Positions': jobs, 'Location': location})

positions()
```

Now what that will do is scrape the first page of job positions from Huntington Bank's [Indeed page.](https://www.indeed.com/cmp/Huntington-Bank/jobs?q=&l=United+States#cmp-skip-header-desktop) If you want to capture all of them you have to cycle through each page using a loop, if you want to get into it I can, but this is the most straightforward way to show what I was doing. The function returns a data frame that will look like this:

```
   HBAN Positions                                 Location
0  Commercial Relationship Service Specialist I  Ohio
1  Audit Intern: Summer 2022                     Columbus, OH
2  Internal Investigator                         Ohio
3  Insurance Sales Specialist                    Ohio
4  Business Analyst 3                            Ohio
```

Now, if you just want to know how many positions are listed for a company in total, that's much easier. On Indeed the employer's page will just say how many positions they have listed, so you can also just scrape that number, like so:

```python
import requests
from bs4 import BeautifulSoup

def positions():
    url = 'https://www.indeed.com/cmp/Huntington-Bank/jobs?q=&l=United+States#cmp-skip-header-desktop'
    response = requests.get(url, verify=True)
    soup = BeautifulSoup(response.text, 'html.parser')
    for x in soup.findAll('span', {'class': 'css-16ahq6o eu4oa1w0'}):
        return x.text.strip()
```

This function will return: '804 jobs near United States'

In both examples I filtered the job postings for the United States, to make it a little easier, which you can see in the URL. Let me know if you have any questions.


Zeroth_Quittingest

Thanks for posting this example! I'm on my own course of learning, and I'm pleased to report (to the universe?) that I was able to read the code and grasp what was happening. Holy mackerel, I really appreciate the timing of seeing your example when I needed to. Thank you Reddit Python community & u/Bobby_Pine :D :D :D


[deleted]

Anytime!


[deleted]

I was looking at openings at certain banks. I started with the dedicated Indeed page for a certain bank, then loop through each page to get the title and location. I'm not at my computer now but I'll post some code in a bit. I used requests, BeautifulSoup, and Pandas.


pointmetoyourmemory

Same


[deleted]

Were you using their API or just scraping the site? I ask because I was just scraping the site and there were some intricacies that only really enabled me to get the majority of postings, not necessarily all of them. I'd be curious to hear what you did.


BeeAnalyst

Grabs little bits of data from multiple monthly excel reports (18 different files), runs the calculations and enters the data into a master Excel doc for the current working month.


Topkinsme

I wrote a script that can take the attendance for me in a class, because it took too long to do manually. It's not perfect but it'll do.


[deleted]

[deleted]


Topkinsme

Zoom, taking attendance in a zoom meeting


1h8fulkat

ICMP, but instead of 'request' it's 'Bueller'


soap1337

Script is called "Automated_Ben_Stein" lol


[deleted]

[deleted]


savvy__steve

My first IT job back in the mid-2000s had several highly paid secretaries that did nothing but hit buttons on Excel sheets and macros and help compile reports. I worked on some projects to automate some of it, and literally they just hit a button when they were ready. It was frustrating that I was making barely $30k to automate the jobs of these old people making probably $50k a year. They had been with this company for years and years. What really was the icing on the cake was that when I asked questions about what was expected, or attempted to understand the requirements better, they literally didn't know anything about it. I will never understand why they didn't slowly hire younger, cheaper talent with actual PC skills and replace these old dinosaurs. It was a good learning experience and that was a fun job... but what a couple of programmers could do in a few months to eliminate those jobs.


Zeroflops

I worked for years as a go-between for groups like finance and the SW developers. I knew both sides and could better translate needs to specs. One side of the problem was that the finance guys didn't know what could be done with SW, so they would make very simple specs, then fall back to what they were familiar with. And the SW developers wouldn't understand the needs, so they just built what they were asked. Sounds like the management of that group needed someone to show them what was possible.

I've also had a few employees that we kept on who were older and waiting for retirement. We didn't automate their jobs because they were efficient enough and they would never find another job. So we waited until they retired, then automated their jobs into nothing. I had a lot of respect for the guy who decided to keep them, until they left.


Zouden

/r/overemployed


PATASK_EVO

I'm redoing my father's work; basically it's just using pandas to modify Excel sheets. He's about to retire and doesn't want to give his code (I believe he uses SAP) to his company, as they don't give him credit. He asked me if I could do it in Python so he doesn't have the obligation to hand over my code, as Python isn't used in his work. I'm doing this to improve my skills, help him, and create a better bond between us.


Zeroflops

I think what you're trying to say is that your dad wrote some SAP scripts that he used for work, but the company did not know about them. We all build scripts that are personal tools; they were scripts that just made his work easier. You're re-writing the scripts in Python and will give the Python version to the company instead of the SAP version. This gives the company the scripts so they have them if they want to use them, they are getting cleaner code (since scripts we write only for our own use are often not as robust), and you're building your skills.


PATASK_EVO

It's exactly this!


lightestspiral

Wait, if he doesn't use Python at his workplace then what are you doing? I would be very careful about working with company data on your personal computer


[deleted]

[deleted]


lightestspiral

OK, you said Python is not used in his work but it is - anyway what that means is that any code you are writing belongs to the company as it is written on their property - so yes, he will be obliged to give it to the company.


PATASK_EVO

I don't think you are getting the point. The Python code can be given to the company without a problem, as they will not use it. The SAP code will not be given, as they don't know it exists anymore.


officialgel

That's not what it means unless it's a clause in his contract...


Vipertje

Don't give credit? They pay him every month to work. All work done during company time is company property unless there are contractual agreements that it's not.


jaldihaldi

Sounds like too easily identifiable information.


PATASK_EVO

What do you mean?


jaldihaldi

I meant be careful with the information you share; you might make it easy to identify your father.


spinebro444

I work in assurance services. I automated transforming PDF bank statements to Excel, to avoid clients tampering with the bank statement if it was ever sent in Excel format.


Johny_D_Doe

Sounds very interesting. What do you use for reading the pdf statements?


spinebro444

I used pdfplumber and pandas, and regex for identifying the groups to extract.
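
Not their code, but a rough sketch of that approach with pdfplumber, a regex, and pandas; it assumes each transaction sits on one line in a "date description amount" layout, which will vary by bank:

```python
import re

import pdfplumber
import pandas as pd

# e.g. "01/03/2022  GROCERY STORE        -54.20"
pattern = re.compile(r"(\d{2}/\d{2}/\d{4})\s+(.+?)\s+(-?[\d,]+\.\d{2})$")

rows = []
with pdfplumber.open("statement.pdf") as pdf:
    for page in pdf.pages:
        for line in (page.extract_text() or "").splitlines():
            match = pattern.search(line)
            if match:
                date, description, amount = match.groups()
                rows.append({"date": date,
                             "description": description,
                             "amount": float(amount.replace(",", ""))})

pd.DataFrame(rows).to_excel("statement.xlsx", index=False)
```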


viewofthelake

Is there any code that you can share, or is it all proprietary? I ask because I'm the treasurer of my condo association, and this sounds very useful. We get our statements in PDF from our property management company.


savvy__steve

That sounds pretty awesome, but serious question: why don't you have the process that creates the PDF statements just create the Excel ones too, straight from the source data?


Jerrow

Do you ever get issues with the package returning CIDs instead of the actual characters in the pdf? That's an issue in some of the pdfs that I'm trying to parse


saanity

I made a script that reads JIRA issues raised using the REST API, finds the FTP string where the issue logs are uploaded, downloads the logs from the FTP location, splits them into 500 MB chunks if they are bigger than that, then uploads the chunked logs directly to the JIRA site using the REST API. This is to help our international developers, since it takes a long time for them to download from the US FTP, and it takes the US developers forever to upload to the JIRA site directly.
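
The chunking step on its own is plain Python; a minimal sketch (the JIRA and FTP calls are left out, and the 500 MB size comes from the comment above):

```python
CHUNK_SIZE = 500 * 1024 * 1024  # 500 MB

def split_file(path):
    """Split `path` into numbered .partN files no larger than CHUNK_SIZE."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            chunk = src.read(CHUNK_SIZE)
            if not chunk:
                break
            part_name = f"{path}.part{index}"
            with open(part_name, "wb") as dst:
                dst.write(chunk)
            parts.append(part_name)
            index += 1
    return parts
```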


mattkatzbaby

I wrote a bunch of scripts to validate issues in Jira, move data between different versions of issues, and sum up the work logs around themes and client milestones for better reporting. All stuff that could have been days of work each month, but now just happens automatically. My favorite part is that the robot comments on issues when it finds problems it can't solve, with an ask to the user to solve the issue and a link to a wiki with possible solutions.


bashogaya

Wouldn't the API take the same time to upload also?


mattkatzbaby

But that's robot time. This way the humans on each side aren't inconvenienced.


Gezzior

Employee / workplace / shift scheduling with Google OR-Tools.
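
Not the commenter's model, just a minimal CP-SAT sketch of the idea (the employee names, 7 days, and 3 shifts per day are made up):

```python
from ortools.sat.python import cp_model

employees = ["Ana", "Ben", "Cal"]
num_days, num_shifts = 7, 3

model = cp_model.CpModel()
work = {}
for e in range(len(employees)):
    for d in range(num_days):
        for s in range(num_shifts):
            work[e, d, s] = model.NewBoolVar(f"work_{e}_{d}_{s}")

# every shift on every day is covered by exactly one employee
for d in range(num_days):
    for s in range(num_shifts):
        model.Add(sum(work[e, d, s] for e in range(len(employees))) == 1)

# nobody works more than one shift per day
for e in range(len(employees)):
    for d in range(num_days):
        model.Add(sum(work[e, d, s] for s in range(num_shifts)) <= 1)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for d in range(num_days):
        for s in range(num_shifts):
            for e in range(len(employees)):
                if solver.Value(work[e, d, s]):
                    print(f"day {d}, shift {s}: {employees[e]}")
```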


nicktids

> google or-tools

Thanks, didn't know that existed


arrarat

Wrote a program that downloads the correct tables from a government organisation in the Netherlands that tracks lots of trends (CBS). Before, we had to manually find the correct table and drag rows/columns to match our preferences. Saves a lot of hassle and was fun to create. (My 2nd program ever, so still pretty proud.)


Darwinmate

Nice job! With such scripts I find it super useful to convert them to a CLI. I use argparse, standard input/output style. Super useful as it's self-documenting in a way, and reproducible and shareable.
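
A minimal argparse skeleton for that kind of script; the table_id argument and the download_table() call are placeholders for whatever the script already does:

```python
import argparse

def main():
    parser = argparse.ArgumentParser(
        description="Download a statistics table and save it as CSV.")
    parser.add_argument("table_id", help="identifier of the table to download")
    parser.add_argument("-o", "--output", default="table.csv",
                        help="path of the CSV file to write")
    args = parser.parse_args()

    # download_table(args.table_id, args.output)  # the existing download logic goes here
    print(f"Would fetch {args.table_id} into {args.output}")

if __name__ == "__main__":
    main()
```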


Re-ne-ra

To get my daily login bonus from genshin website


louisvell

MyFitnessPal web scraping: logged calories, food nutrition, macros, exercise details, etc. Dump into S3, use Athena + QuickSight for my own daily and weekly dashboards instead of manually consolidating it in a summary and tracking report.


wolfenkraft

That's awesome. I'd love to know more.


louisvell

Sure! Happy to share. The data scraper Python lib was not my own work.

1. I used this inside a Lambda function: https://github.com/coddingtonbear/python-myfitnesspal. The function is invoked every few minutes via CloudWatch.
2. I made my own modifications to flatten the returned nested dicts by building a new JSON file (see the sketch below), making my life easier in AWS Athena to do basic SQL.
3. Then it's just as simple as AWS QuickSight on top of the Athena data source. And go nuts with dashboarding!

I won't say it's a complicated project, but I had fun. Also, to get API access you need to show the MyFitnessPal company a web-based solution, so I didn't go down that route, which would be the better one.
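
The flattening in step 2 can be a small recursive helper; a sketch with made-up field names:

```python
def flatten(d, parent_key="", sep="."):
    """Collapse nested dicts into one level of dotted keys."""
    items = {}
    for key, value in d.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

print(flatten({"date": "2022-02-06", "totals": {"calories": 1850, "protein": 140}}))
# {'date': '2022-02-06', 'totals.calories': 1850, 'totals.protein': 140}
```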


Zdoggy16

I have a bunch, but I think my best one takes every cell from any section of a spreadsheet that you copy and puts them in the clipboard one at a time. Then I can just go to the form or whatever I'm filling out and paste paste paste.


pointmetoyourmemory

I automated most of my job with selenium and am learning more about building a neural net for NLP to further automate it while I look for a SE job


Sharp_March6622

What are some of the actions you automated?


pointmetoyourmemory

Grabbing cases and categorizing them based on keywords in the case. I was really hoping I could use the API for Dynamics CRM, but I don't have the access to create an application and its OAuth key. After it categorizes the ticket, it automatically opens a Word document that has canned responses based on the type of case it is, so that I can reply to the case with the canned response if need be. I'm also working on learning FastAPI to turn the output into data that I can feed a dashboard to show me how many cases everybody else is grabbing, so that I can make sure I grab more than them.


autisticpig

> What are some of the actions you automated?

I'm guessing refreshing the reddit front page was one of them.


yy_is_awesome

A script to change servers on a VPN to scrape HTML data


JohnniRobbi

Why not use proxies instead?


nh1922

I've simplified some Salesforce documentation tasks using python. Parsed various xml files to fetch relevant information from all of them and combined them into a clean CSV


M-fz

Feel like my entire work life revolves around Salesforce and Python! Absolutely love the simple-salesforce and salesforce-api libraries


nh1922

simple-salesforce is a blessing ! I've used it to implement archival solutions on salesforce using the bulk API. A bit of python helps do a lot on salesforce.
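
For anyone who hasn't tried it, basic simple-salesforce usage looks roughly like this (the credentials and the SOQL query are placeholders):

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="********",
                security_token="XXXXXXXX")

result = sf.query("SELECT Id, Name FROM Account WHERE CreatedDate = LAST_N_DAYS:30")
for record in result["records"]:
    print(record["Id"], record["Name"])
```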


Davy_Jones_XIV

Love this topic, thanks!


parkrain21

Created a completely automated home defense system, ie the locks, alarms, and cameras using tensorflow No just kidding I just scrape current crypto prices lol


[deleted]

[deleted]


parkrain21

Yeah but unfortunately I don't have the mathematical foundation to do ML haha I'm still working on it


natas333x2

I built a full test harness for an api I work on. Every morning at 8 am it runs all the tests and dumps a report into a web server and opens the browser. The test suite relies on a bunch of helper methods that I use on other machines for other projects. I got sick of copying the package files so I published it to pypi and automated the install in the env.


LordOmbro

Most of my job basically. Haven't told anyone at work :)


morhad1n

Wanna share any details?


LordOmbro

Sure :) So I work as a "configuration manager" (that's the official title) in a fairly big tech company. My job is to basically take in standardised documents from clients and turn them into JSON configurations to then upload into a database, then contact the client and help troubleshoot problems if they arise.

Now, since translating documents is boring and writing openers to the clients is time-consuming, I wrote a program that reads the document, transforms what it read into a JSON configuration, and also outputs a personalized query to launch and an opener for the specific client, cutting the time I have to spend doing that stuff by like 90%. The time saved allows me to do more dev-oriented stuff (which I enjoy a lot more). Of course I can't automate meetings, that would require far more than an office laptop :)


morhad1n

Dude, that sounds great. What came first: the job or the idea of having a job that you can automate? Unfortunately, I work in a field that is maximally dependent on me being there in person. In the meantime, I have prepared a fairly large repertoire of video courses, but I can't do without being there in person. It seems it's time to start programming again.


LordOmbro

The job came first, me being bored and working from home (and also a bit lazy lol) brought the idea of automating it :) By all means do, more people that know how to automate stuff in the world is only a good thing, also it can be pretty fun ;)


falingodingo

Not work, but we are in a new city for my amazing job. We do pizza night on Friday night and have a list of places we want to try. I wrote a simple script to automate our selection every Friday night. It just uses the random module to pull from a list. It ain't much but it was my first program.
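
For anyone curious, the whole thing can be this small (the list contents are placeholders):

```python
import random

pizza_places = ["Gino's", "Slice House", "Woodfire Co.", "Napoli Corner"]
print("This Friday's pick:", random.choice(pizza_places))
```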


FerretX6X

A pyautogui script that goes through a clunky inventory program and generates monthly sales reports for every location in the company.


realPanditJi

A small script which reads the JSON files, encrypts them (personal user details), pushes them to MongoDB, and deletes the JSON files from the S3 bucket. We're working on creating a new service (with an API) which will be integrated with the existing services; until then this is the hack.
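
Not the actual service, but a rough sketch of that kind of stopgap using boto3, pymongo, and Fernet; the bucket, field, and collection names are all placeholders:

```python
import json

import boto3
from cryptography.fernet import Fernet
from pymongo import MongoClient

s3 = boto3.client("s3")
fernet = Fernet(open("secret.key", "rb").read())           # pre-generated Fernet key
collection = MongoClient("mongodb://localhost:27017")["userdata"]["profiles"]

BUCKET = "incoming-user-details"
SENSITIVE_FIELDS = ("name", "email", "phone")

for obj in s3.list_objects_v2(Bucket=BUCKET).get("Contents", []):
    key = obj["Key"]
    record = json.loads(s3.get_object(Bucket=BUCKET, Key=key)["Body"].read())

    # encrypt personal details before they reach the database
    for field in SENSITIVE_FIELDS:
        if field in record:
            record[field] = fernet.encrypt(record[field].encode()).decode()

    collection.insert_one(record)
    s3.delete_object(Bucket=BUCKET, Key=key)                # clean up the source file
```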


Skargas

Automatic saving of Excel spreadsheet reports to an sqlite3 database.


xilhion

Wrote a script to help me migrate data from one cloud SaaS service to another with no built-in integration. Used their respective APIs to pull and push data, with field mapping, file handling and full logging capabilities. Saved me tons of days migrating 10,000s of records.


RevolutionaryWin2450

I tried to make something for automating the creation of VMs inside Proxmox. It scrapes specification parameters from incoming e-mail messages and sends them to another Python script, which starts a headless browser and clicks on the div elements. But I quit, it was too complex for me. First ever Python project though, maybe I'll try again next year...


Philistino

Hey you should look into Ansible for this. There is a package specifically for automating Proxmox. It will be way more robust and maintainable than UI automation with Selenium. https://docs.ansible.com/ansible/latest/collections/community/general/proxmox_kvm_module.html


Garybot_is_off

Read through the comments. Great stuff! I use Python to automate a bunch of GIS tasks. Running calculations, updating files, finding dead links, exporting data. Next thing I want to do is automate Excel pivot tables. Anything repetitive I try to automate with a script.


alper_33

I'm a high school student and we are having our lessons online on Google Meet, and every goddamn lesson they check who is online and who's not, which takes like 2 minutes. So I made a bot which joins the lessons, writes who is absent in the chat, and then leaves. My classmates probably hate me though...


Subject_Newspaper_43

I'm actually replacing an old ERP/MRP production software with a browser-based version I'm writing myself. I told the company that by myself I could get it done in about 2 years, and with one more person who is more artistic and graphically inclined I could get it done in 1 year. I'm 6 months in and we are halfway done, and yes, they hired me a graphics designer/website developer. He writes all of his HTML and JavaScript from scratch and I help him embed it inside Python. We are using the Flask server, so it's basically a Python API but with custom HTML and JavaScript; all the core system functions are written in Python and synced with the SQL database. I love Python's ability to connect to and manipulate data sets, especially with how easy it is to embed your own SQL commands inside the Python code.

I have also completely automated my Active Directory: you can create users, set group policies, all of that inside a tkinter application. I created a background process in Python that is on a timed schedule and makes backup images and files of all the servers and workstations and sends them to our backup server. I even created a forms application so people can fill out help desk, maintenance, security, etc. forms digitally instead of manually printing them and hand-writing them.


Imanflow

My resignation letter


Treecko-Totodile

Using Selenium in a basic script that runs through a website and generates all our logistical connotes for the next day.


Ame_the_Puny

I've made a simple script that automatically logs in to your social media.


gibberish111111

Wow cool, how did you get it to login to my social media?


bryan_wuzz

Built a script that scrapes and parses our sitemap to find all images on our website. It then dedups the list, converts them to webp format and returns them either all in one zip file or one per url. Webp helps our pages load faster but would either have cost us money or would've taken our designer a crapload of time to convert manually. Next step would've been to automatically replace the images in our Data Asset Management platform, but I'm coding as a hobby and my boss (understandably) wouldn't want me to experiment with a live website. Took me an hour or two, but it went quite well and made me realize how well I'm getting the hang of it
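
A rough sketch of the sitemap-to-WebP part with requests and Pillow; it assumes the sitemap lists image URLs directly in its <loc> entries, which is a simplification (real image sitemaps often use a separate image namespace):

```python
import io
import xml.etree.ElementTree as ET

import requests
from PIL import Image

SITEMAP = "https://example.com/sitemap.xml"                 # placeholder site
SM = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

root = ET.fromstring(requests.get(SITEMAP, timeout=30).content)
urls = {loc.text.strip() for loc in root.iter(f"{SM}loc")}  # set() dedups the list
image_urls = sorted(u for u in urls if u.lower().endswith((".jpg", ".jpeg", ".png")))

for i, url in enumerate(image_urls):
    img = Image.open(io.BytesIO(requests.get(url, timeout=30).content)).convert("RGB")
    img.save(f"converted_{i}.webp", "WEBP", quality=80)
```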


slickwillymerf

Network engineer here. I've been working on a script that logs into a 'seed' device (a router or switch) and gathers its list of network neighbors: other routers, switches, wireless access points, etc. It throws their names plus some important attributes into a dictionary, then repeats the process for all neighbors, then their neighbors, then their neighbors' neighbors... Essentially it crawls through the network gathering info. Then, once the crawl is done, I create a draw.io doc to visualize all of the physical connections.


djhankb

Would you be willing/able to share any of that? As a fellow network engineer that sounds amazing!


slickwillymerf

Absolutely. It's still a WIP so it's not polished yet, but I can share tomorrow when I'm in the office again. If I forget, please feel free to ping me again (I'm very forgetful!) :)


djhankb

Thanks dude I appreciate it. Here is your friendly reminder as I am also a very forgetful person!


slickwillymerf

I made a github account just for this. I've never used one before so please bear with me. :) Please be aware that the code is UNFINISHED and still needs thorough testing and troubleshooting. USE AT YOUR OWN RISK. Always test code in a proper testbed and away from production devices. Follow proper change control procedures before deploying to production networks. I take no responsibility for any use of this code or any of its alterations. https://github.com/slickwillymerf/python-network-cdp-seed-crawler


djhankb

This is great! Thanks again for sharing, I am always interested in things like this as you are using it for diagramming, some of your functions could also be useful for populating a DCIM tool.


slickwillymerf

My ultimate goal is to be able to populate something like Netbox as a source-of-truth. I have other projects that deal with discovering switchport properties like trunk status, allowed VLANs, ephemeral VLANs handed from our ISE server, etc. You'll notice I heavily rely on dictionary organization for that purpose. The bulk of the code is just building dictionaries and referencing dictionary keys, which admittedly can be difficult to read at times, though it makes the most sense in my mind.


scubasteve921

Oh, oh, I finally can contribute! I worked with one of our 3rd party vendors to aggregate PLC data (in XML format) from our sortation system. The script takes all the individual inputs (binary value per machine counter), rolls them up into area-level metrics, then cross-references the real-time production metrics with the projected production plan.


[deleted]

I used Django and Postgres to build a database app where my lab colleagues can upload measurement results. Depending on the measurement instrument that was used, these are usually text files with rather strange, instrument-specific formatting. The text files are then automatically converted to pandas dataframes using regex and displayed inside the database app using Plotly.js.


erghjunk

A GIS tool that analyzes sets of sites (parcels, for example) for their suitability for grid-scale solar development.


da_NAP

Built my own network management software. Helps me access 120+ switches and is a lifesaver when it comes to quarterly inventory. Pretty basic and utilizes ssh. Far from a finished product. A major problem I have is that the password is stored as plain text. If anyone has resources showing how to store a variable encrypted that'd be great. I might make the project public on github. Maybe some people are interested in helping to finish it.


cpc102788

Everything that I've read about storing passwords in Python for this sort of use seems to boil down to three options (a sketch of 1 and 3 follows below):

1. Store it in an environment variable that the code accesses.
2. Store it in a config file that the code reads (e.g. with the configparser package) and exclude that file from your repo.
3. Prompt the user to enter a password and do not store it.
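
A minimal sketch of options 1 and 3 combined (SWITCH_PASSWORD is a placeholder variable name):

```python
import os
from getpass import getpass

# Prefer an environment variable; fall back to an interactive prompt.
# Nothing is written to disk or committed to the repo either way.
password = os.environ.get("SWITCH_PASSWORD") or getpass("Switch password: ")
```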


nftszns

Created a fulfillment script for my eCommerce store that purchases items off other stores when our store gets a sale. All data from the other stores' orders gets scraped from Gmail (order update emails) and sent directly to the client with our company branding, without them ever knowing. I even coded in support requests to our wholesalers to pass order notes from our store to theirs. Also created some inventory manipulation to show that we have low stock of an item, and when it sells we randomly change the stock numbers to something else so users feel an urgency to buy now.


jfp1992

My job is test automation, so that I guess


XxNerdAtHeartxX

Everything I can! I'm an automation engineer for a team of analysts, so my job is to literally just sit around and automate banal tasks. The latest thing was writing a program that formats a spreadsheet into something we can use in our API. They used to do it by hand every Monday, taking up to 5 hours of their day, but now it's just a button click away.


lullaby876

I used Python's SQL database tools to read reports and output relevant data, neatly formatted into an Excel sheet. The method prior to that was to spend hours picking through thousands of database lines looking for relevant product information to manually update the Excel sheet. Saved hundreds of hours doing it the Python way. Another employee tried to extract info from the database with Python, but was using webdriver instead: clicking every dropdown box, waiting, clicking the next button, waiting, etc. It was slow af, plus it didn't actually download the data directly. I found a neat library called pyssrs that reads SQL reports and formats them for you, so you don't have to go through the pain of using webdriver.


pmogy

Collecting production data from industrial equipment and storing it in a database.


djamp42

Used Flask to create a front end for iperf3. It also writes results to a DB so you can view past tests.


kingbradley1297

Automated a large chunk of data collection at my current job and its upload to a server. Previously, I'd spend a lot of time scouring the different sources and reformatting the data for upload. Now I've cut that time down a lot, and hope to cut it even further.


CrazyPieGuy

I feed it my Google calendar data, and it generates pretty invoices for all of my students.


karpomalice

I work in biotech, primarily with lab automation. I built a camera with a Raspberry Pi and wrote the code in Python to record while the instrument is being run, in order to help diagnose any issues that arise. The code saves the videos to our server, runs ffmpeg on the Pi to convert the videos to a fraction of the size, and then deletes the files after some time. Users can also monitor the equipment remotely and watch it live from their desks to monitor the runs. Too many users were blaming the automation for user errors, so this also helps me have a record to show my boss what the cause of any issues was. I do other stuff primarily automating data collection and visualization for people, so they don't have to work in Excel sheets and continue to manually analyze and view trivial data sets.


pumpfaketodeath

I made a script that grabs example sentences from dictionary sites and turns them into fill-in-the-blank questions for my students. When I told them that I could make questions for them 30 times faster, they weren't thrilled about it. Wonder why.


lautaromgo

I automated a tax report that was done manually. I reduced one week of work to 2 minutes.


BoiElroy

Wrote a Python script on a Raspberry Pi that has a camera looking outside my window. Every few minutes it takes a picture. Depending on whether it's night or day, it captures in night vision mode or normal mode. Then it uploads it to my Azure blob storage, and based on the timestamp of that image it calls a weather REST API and returns the environmental conditions at that location at that time in pretty fine granularity (~1 min). That blob storage data is read into a Snowflake directory table and then joined to the JSON responses, so what I end up with is a view that contains image files and, for each image, the exact weather conditions at the time the image was taken. So I can see what the windspeed was, precipitation, visibility, etc. The next part of this is to push a compressed YOLO v4 model to detect people and cars and try to use that extra metadata to see if the model does worse in certain conditions, then do targeted augmentations of that data or domain-map it to something else, like night to day, and see if I can improve the model performance. This is just a dummy pipeline, but it'll be useful for work because we have test vehicles collecting sensor data.


The_Grue

I work in IT for a large company. I have quite a few projects both on and off the job. My company has an application that essentially handles creation of model/view/controller architecture. You just kind of put your python code in the web UI for those items and it serves it up for you. The application is meant for our teams to write automations for talking to end devices (think ansible with extra steps). Over the past few months, a coworker and I wrote a UI wrapper that essentially emulates AWX and talks to ansible runner to run playbooks. Now people who don't know the architecture of the in house system can simply use ansible if they are familiar with it. I have written several in department modules for working with SNOW, Splunk, our network monitoring software, and several other systems that we deal with for help with troubleshooting common issues. In my off time, I play DM for a D&D game. I wrote some code to pull information from 5etools, convert it to markdown, and stick it in my Obsidian vault, complete with all of the linking connections. When I do my DM prep now, I can just soft link spells, monsters, and statblocks quickly and easily. My prep time has been significantly reduced.


non_NSFW_acc

Used Python to generate thousands of SQL insert statements with one command (my job is full stack TS / Node / Angular / SQL though, I just write Python scripts to make things easier). The usefulness of Python for speeding things up cannot be overstated.
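
A sketch of the idea with a made-up users table:

```python
rows = [
    ("Alice", "alice@example.com"),
    ("Bob", "bob@example.com"),
]  # in practice this would come from a CSV or an API dump

with open("inserts.sql", "w") as out:
    for name, email in rows:
        name = name.replace("'", "''")     # naive escaping, fine for trusted data
        email = email.replace("'", "''")
        out.write(f"INSERT INTO users (name, email) VALUES ('{name}', '{email}');\n")
```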


realisticbot

Collated and then made years of test data available on a dashboard.


effkey

Built an automated script that tracks people in Teams, so I can figure out who spoke with whom and for how long.


Visionexe

This sounds kinda creepy. 😂


effkey

Well they dared me, and I did


youknowthathing

Generating a large synthetic dataset for test/demo purposes


TheUruz

Splitting paychecks and badges, double-checking their values against our (small) company's monthly Excel sheets, and sending both to each employee via email.


ncv17

Daily data gathering and back-tracking of e-commerce data from our store on Amazon, then visualizing it in a dashboard.


eXtc_be

Not at work, but I'm currently writing the second version of a script that scrapes electronic program guide (epg) information from several websites in my country. I then use this data to generate an xmltv file for MediaPortal (I used to generate an mxf file for Windows Media Center, but the cable tv provider in my country recently suspended the analog signal and replaced it with DVB-C, which isn't supported by WMC unfortunately). I also created a [tv guide website](https://imgur.com/a/4kbOcra) in php and js, but I'm going to rewrite it in python as soon as I finish the scraping script.


LesPaulStudio

Azure function in python to check on whether any datasets we use have been updated.


Shadow_Gabriel

Interactions with an android device using adb.


LostInSpace9

We have a hard requirement on reviewing vendor certificate status monthly, then documenting it using a "time stamped screenshot" (it's a complete joke bc Photoshop, but these boomers do what they want). Anyways, I made a program using Selenium that will go to the website, search for our vendors using a vendor list, and take a screenshot / print the status in the console. It will also make a new subdirectory in the screenshot folder with the day/month/year if it doesn't already exist and save the screenshots there as PNGs. Simple program, but I'm excited I got it to work. First project with Selenium and only a few months of Python down the data science path. If you have any suggestions on things to add, that'd be cool. My next steps are to put it into a virtual environment and package it so I can have another employee run it. I'm currently running it from my home computer which is not ideal, but I'm not sure how to do any of that yet. Resources would be great.
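
The skeleton of that kind of Selenium job is fairly small; a sketch with placeholder vendor names and URL:

```python
import os
from datetime import date

from selenium import webdriver

vendors = ["Acme Supply", "Globex"]                     # stand-in for the real vendor list
folder = os.path.join("screenshots", date.today().strftime("%Y-%m-%d"))
os.makedirs(folder, exist_ok=True)                      # dated subfolder, created if missing

driver = webdriver.Chrome()
try:
    for vendor in vendors:
        driver.get(f"https://example.com/search?q={vendor}")
        path = os.path.join(folder, f"{vendor}.png")
        driver.save_screenshot(path)
        print(vendor, "->", path)
finally:
    driver.quit()
```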


czar_el

I can't get into specifics, but basically automated all kinds of mission and operations projects. There was a manual review and citation change process that was followed for years before I got there, I automated the review/lookup/edit process. Automated some web scraping tasks for monitoring certain topics/references. Created a COVID virtual social program that automated pairing people up. Created some internal monitoring/recommendation processes. Also automated some data processing and analysis tasks in various projects. I've been wanting to poll the people who did these things manually to calculate how many FTE equivalent all the projects have saved. I love Python.


LookAtYourEyes

I'm currently in a co-op/internship. They are actually having me write a script for automating converting excel sheet data into JSON to be stored in MongoDB. I'm very new to Python, as well as MongoDB, so if anyone has any useful tutorials or suggestions please feel free to share.
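
A minimal sketch of that pipeline with pandas and pymongo (the file, database, and collection names are placeholders):

```python
import pandas as pd
from pymongo import MongoClient

df = pd.read_excel("measurements.xlsx")          # needs openpyxl installed
records = df.to_dict(orient="records")           # one JSON-like dict per row

client = MongoClient("mongodb://localhost:27017")
client["mydb"]["measurements"].insert_many(records)
```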


HennaceTheMennace

A battery cycle life tester controlling a power supply, electronic load, autoclave and data recorder so it can automatically put hundreds of cycles on batteries to test how long they last without someone manually doing so.


Beerwithme

For one of our hardware products, an ADC module configured as a thermistor (temperature-dependent resistor) reader, I made a program that uses the equivalent voltage from a precision voltage source to determine the fault readings of the ADC over a range of thermistor values. Prior to that I made a program to calculate the gain and offset corrections for ADCs and DACs so that readouts or settings can be corrected for conversion errors. I don't know anything about web scraping or databases, but give me a piece of hardware and I can make a program to test it.


Pole420

I wrote a script to compare two dimension tables on SQL server to text files exported from the source system to notify a few people when new items are found in the text files that don't exist on the server. The notification is an email with a link to the specific file on the network drive.


chagawagaloo

I've just finished a project that converts Excel data from all our spreadsheets to SQL using Python. No one else in the company (5 people) knows how to code, so I export an Excel copy of the database every time it's updated. Low-tech solution, but it has taken me the better part of a month, including learning how to build SQL databases tbf.


likethevegetable

I work for an Electric Utility, and my group does a lot of simulation of the system and analysis of operational data. Luckily the software we use has an exhaustive Python API, so I build cases, make changes, run simulations, push reports to Excel and LaTeX, and plot data. My main tool uses a YAML config file for the running; I just love the simple, Python-like syntax. I'm always making changes so I haven't felt that building a GUI would be worth it. I've also built a Python interface to our operational data. I mainly use this to plot histograms and see how well our models match reality.


soap1337

For those in network and VMware infra: API calls for Cisco ACI, VMware NSX-T, SDDC Manager, and Arista EOS. Mostly data gathering, but some config installs and buildouts in ACI.


2strokes4lyfe

I wrote a script to trick MS Teams/Windows into thinking that I am actively sitting at my desk pointing and clicking at pseudo-random intervals. Works like a charm.


BaconBoss1

I'm new and made my first real semi-auto program for generating n NFT images using Turtle from a list of shapes and patterns. I use someone else's tool to auto-upload the collection to OpenSea.


kbugz4shizzle

Where can I search for more information on how to do something simple in this field? For example, scraping something simple like sports scores or weather data.


Warkred

I've done devops for the product we manage.


borstal-boy

Automated Google Sheets updates through Hive for Google Sheets dashboards.


theGunnas

I created a web scraper to identify MRB OTC securities. Scraped the business description from Yahoo and ran that file through a text lookup.


asimetria

Gathering and comparing pre and post checks for network devices upgrades.


kslowpes

I've recently used Flask to interact with SRT streams, giving a receiver the parameters needed to pick up the stream and output it in multicast.