Software Engineer
Charleston, SC
Skills
C# Python Powershell Java Javascript SQL Angular Typescript
Continuous integration and continuous deployment (CI/CD) enables a faster, more adaptive approach to building and extending a
great solution. Building a CI/CD pipeline that leverages Blue-Green deployments helps ensure new code can be tested and run in
an environment identical to the one customers/users will experience. An integral part of a comprehensive and efficient CI/CD implementation is
regression testing.
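Conceptually, a Blue-Green swap can be sketched as follows. This is a toy illustration (the class, slot names, and health check are all hypothetical); real deployments use platform features such as Azure deployment slots rather than hand-rolled code.

```python
# Toy sketch of a Blue-Green deployment swap. All names here are
# hypothetical; in practice this is handled by the platform (e.g.
# Azure App Service deployment slots), not application code.

class BlueGreenDeployer:
    def __init__(self):
        self.slots = {"blue": "v1", "green": None}
        self.live = "blue"  # slot currently serving traffic

    @property
    def idle(self):
        return "green" if self.live == "blue" else "blue"

    def deploy(self, version, health_check):
        """Stage a version in the idle slot; swap traffic only if it
        passes the health check, so the live slot is never at risk."""
        slot = self.idle
        self.slots[slot] = version
        if health_check(version):
            self.live = slot      # the "swap": traffic switches slots
            return True
        return False              # failed check: live slot untouched
```

The key property this illustrates: a failed deployment never touches the environment users are on, and a successful one switches traffic all at once.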
Thorough regression testing as code is pushed to the idle (staging) environment helps ensure the code is well tested before it goes live. What is better than thorough regression
testing? Thorough regression testing that is completely automated. Automating a solution's regression tests improves reliability and security and
cuts down on the errors that reach customers/users.
For my team at Blackbaud, many pieces had to come together to get CI/CD with Blue-Green deployments running. Luckily, Microsoft's Azure documentation
covers much of the setup, and having a smart team helps too. I did not drive much of the Azure setup myself, but I was able to shadow many of the
pivotal steps, like configuring the settings within the Azure Portal.
The majority of my work on this project was building out our automated end-to-end testing. We wrote these tests with Playwright for Node.js, using the Playwright
Test runner. This allowed us to run tests across multiple browsers, execute tests in parallel, and capture screenshots during test execution. As of this
writing, we have added 31 suites/tests that automate the testing of the major functionality within our solution, and I suspect more will follow:
a Playwright test consideration is now part of our planning for new features.
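For a rough idea of what such a test looks like, here is a hedged sketch using Playwright's Python API (our actual tests use Playwright Test for Node.js; the URL, test name, and helper below are placeholders, not our real code):

```python
# Hedged sketch of a cross-browser smoke test. The team's real tests
# are written with Playwright Test for Node.js; this uses Playwright's
# Python API to show the same idea. URL and names are placeholders.

BROWSERS = ["chromium", "firefox", "webkit"]

def screenshot_name(test_name, browser):
    """Build a per-browser screenshot filename for one test run."""
    return f"{test_name}.{browser}.png"

def run_smoke_test(url):
    # Imported inside the function so the helpers above remain usable
    # in environments where Playwright is not installed.
    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        for name in BROWSERS:
            browser = getattr(p, name).launch()
            page = browser.new_page()
            page.goto(url)
            # Capture evidence of the run, one shot per browser.
            page.screenshot(path=screenshot_name("smoke", name))
            browser.close()
```

Running the same test body across chromium, firefox, and webkit is what the multi-browser coverage mentioned above amounts to.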
Using Python, I built a simple pair of scripts that automate the once-manual process of fetching US public school and public school district data. It is important to
have data you can rely on, and for US schools and districts the government publishes it a handful of times a year. For some time, my previous project
manager had to navigate to the government site and download the data by hand. This was tedious: the website is pretty dated, the data is divided across
each of the US states and territories, and reaching each state/territory's download took about four clicks. That adds up to a lot of clicks.
To help my project manager out, I built a Selenium WebDriver-based script in Python that downloads all of these files for him. The script does take a while to
run, but nothing close to the manual process: something that took my project manager a couple of hours finishes in about 20 minutes.
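The shape of that script, sketched with hypothetical URLs and a truncated state list (the real government site and its link structure differ):

```python
# Hedged sketch of the bulk-download script. The base URL, query
# shape, link text, and state list are placeholders, not the real
# government site.

STATES = ["AL", "AK", "AZ"]  # in reality: all 50 states plus territories

def expected_download_count(states, files_per_state=1):
    """How many files one full run of the script should fetch."""
    return len(states) * files_per_state

def download_all(base_url, download_dir):
    # Imported inside the function so the helper above remains usable
    # without Selenium installed.
    from selenium import webdriver

    options = webdriver.ChromeOptions()
    # Send browser downloads straight to our target directory.
    options.add_experimental_option(
        "prefs", {"download.default_directory": download_dir}
    )
    driver = webdriver.Chrome(options=options)
    try:
        for state in STATES:
            driver.get(f"{base_url}?state={state}")  # placeholder URL shape
            driver.find_element("link text", "Download").click()
    finally:
        driver.quit()
```

The win is purely mechanical: the script clicks through every state/territory page unattended instead of a person doing roughly four clicks per download.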
Beyond automating the downloads, I also automated the next step my project manager had to do: combining the downloaded files into a single file.
I wrote the script for this as well. These were ordinary Excel files, except for some odd formatting: a graphic occupied the first few columns/rows of each.
That was pretty simple to work around, and the whole script cut a very tedious process down tremendously while freeing up my project manager.
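The interesting part of the merge is simply skipping each file's leading graphic rows before concatenating. A minimal sketch, using plain lists of rows instead of real Excel files (the real script used a spreadsheet library):

```python
# Sketch of the merge step. Each source spreadsheet began with a few
# rows occupied by a graphic, which must be dropped before combining.
# Sheets are modeled here as plain lists of rows for illustration;
# the real script read Excel files.

def combine_sheets(sheets, skip_rows=3):
    """Concatenate rows from every sheet, dropping each sheet's
    leading rows (where the embedded graphic lived)."""
    combined = []
    for sheet in sheets:
        combined.extend(sheet[skip_rows:])
    return combined
```

Once the graphic rows are skipped, the per-state files line up and the merge is a straight concatenation.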
MFA and Password Changes:
Before the YourCause teams I was on joined Blackbaud, we did not have Multi-factor Authentication (MFA) set up for
signing in to our site. After joining Blackbaud, we had to implement some pretty important security measures, MFA among them. We added this functionality
to our login process: users must enter a code sent to their account's email address. A user can opt to have us remember their
device, in which case they are not prompted for an MFA code again for 30 days. These
remembrances can be cleared at any time within the 30 days, and after that they expire automatically. Although this functionality currently only delivers the
MFA code to the account's email, the code is easily extensible to other options, like an authenticator app or mobile device.
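The remember-device rule can be sketched like this (the function and field names are illustrative, not our actual schema; the real implementation lives in our C# backend):

```python
from datetime import datetime, timedelta

# Illustrative sketch of the remember-device check. A device skips the
# MFA prompt only if the user opted in AND the remembrance is younger
# than 30 days. Names are hypothetical, not our real schema.

REMEMBER_WINDOW = timedelta(days=30)

def mfa_required(remembered_at, opted_in, now=None):
    """True if the user must be prompted for an MFA code."""
    now = now or datetime.utcnow()
    if not opted_in or remembered_at is None:
        return True  # never remembered, or remembrance was cleared
    return now - remembered_at >= REMEMBER_WINDOW
```

Clearing a remembrance is just nulling `remembered_at`, which makes the expiry and the manual-clear paths collapse into the same check.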
I don't want to get into too many details about our password changes, for obvious reasons. What I can say is that we added considerably more
complexity to what counts as an acceptable password: the rules enforce a minimum number of characters and symbols, and additional features further increase the
strength and resilience of user passwords within our product.
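As a purely generic illustration of this kind of rule (this is deliberately NOT our product's actual policy):

```python
import re

# A generic password-complexity check, for illustration only. These
# particular rules and thresholds are NOT the policy shipped in our
# product; they just show the shape of such a validator.

def is_acceptable(password, min_length=12):
    checks = [
        len(password) >= min_length,            # minimum length
        re.search(r"[A-Z]", password),          # an uppercase letter
        re.search(r"[a-z]", password),          # a lowercase letter
        re.search(r"\d", password),             # a digit
        re.search(r"[^A-Za-z0-9]", password),   # a symbol
    ]
    return all(bool(c) for c in checks)
```

A real policy would typically layer more on top (banned-password lists, breach checks, rate limiting), which is the sort of "additional features" alluded to above.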
Splunk, Mixpanel, and other Logging:
Another important security measure we were asked to introduce to our solution was more robust monitoring. We implemented Splunk
logging so that different actions and abilities are logged along with other security- and code-related information. This helps increase visibility into
the actions users take. We also adopted a similar technology for our frontend code: Mixpanel, which logs and records the steps and
actions users take. This information is also available in Application Insights within the Azure Portal; however, Mixpanel offers a more user-friendly
and customizable solution that serves a vast range of purposes, including increasing security and visibility within our product.
Besides these two forms of logging, I introduced a job that scans our API's log table so that if too many errors occur within a certain time range,
or if any fatal errors appear, we are made aware immediately. This keeps us "actively" monitoring our solution while allowing us to act quickly and
reliably when issues come up.
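The alerting rule can be sketched as follows (the entry shape, window, and threshold are illustrative, not our real configuration):

```python
from datetime import datetime, timedelta

# Illustrative sketch of the log-scanning job's decision rule. Entry
# shape, window, and threshold are hypothetical, not our real config.

def should_alert(entries, now, window=timedelta(minutes=15), threshold=10):
    """Alert on any recent FATAL entry, or on too many recent ERRORs."""
    recent = [e for e in entries if now - e["at"] <= window]
    if any(e["level"] == "FATAL" for e in recent):
        return True  # fatal errors alert immediately, no threshold
    errors = sum(1 for e in recent if e["level"] == "ERROR")
    return errors >= threshold
```

Run on a schedule against the API's log table, a rule like this is what turns passive logs into "active" monitoring.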
Serving files from our API:
Who enjoys having SAS tokens floating around? That isn't good for many reasons. Better yet, what if you didn't need SAS tokens at all to
serve files to your users? This approach lets you lock down access to your product's file storage so that only certain IP addresses
can reach it, perhaps just a single one: the IP address of wherever your site is hosted.
To do this, I built a set of authenticated endpoints that serve files to our users. Instead of returning links to
the storage account in Azure, our API returns a URL to one of its own endpoints, so our UI requests the file from our API, which fetches the file
and serves it itself. Luckily our files have a size limit, which reduces any slowness this approach could introduce for large files.
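The flow can be sketched like this (class and names are hypothetical, and a dict stands in for the storage client; the real implementation is a set of C# API endpoints backed by Azure storage):

```python
# Sketch of the authenticated file-serving flow. The API fetches the
# blob itself and streams it back, so no SAS token ever reaches the
# client, and storage can be IP-restricted to the API host alone.
# Class/field names are hypothetical; a dict stands in for storage.

class FileService:
    def __init__(self, storage, max_bytes=10 * 1024 * 1024):
        self.storage = storage      # stands in for the blob storage client
        self.max_bytes = max_bytes  # our files have a size cap

    def serve(self, user, file_id):
        if not user.get("authenticated"):
            raise PermissionError("login required")
        data = self.storage[file_id]  # API-side fetch; client never sees storage
        if len(data) > self.max_bytes:
            raise ValueError("file exceeds size limit")
        return data
```

Because the client only ever talks to the API, authentication and the storage lockdown both live in one place.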
I am primarily a backend/API developer, but I work wherever my team needs me: contributing to the frontend, assisting with DevOps-related tasks, and taking on in-house projects that provide value to my team and the business.
C# SQL Javascript/Typescript Angular Azure DevOps Security CI/CD Automation Testing Postman Python Powershell Git/Source Control Backend Frontend
Backend/API developer for YourCause's Nonprofit Solution. Added new functionality/endpoints and refactored and improved existing ones. I was eventually asked to help with YourCause's Grants product as well. YourCause was acquired by Blackbaud in 2019 for $157 million.
C# SQL Postman Python Powershell Git/Source Control Backend
Helped migrate on-premises applications and technologies to the cloud. Ensured existing data kept functioning, and that new data remained compatible both with customers' on-premises systems and with those we were moving to the cloud.
Java Javascript Oracle Database
Maintained and extended the website and web applications of Wofford College. Worked closely with other team members to task out and break down different assignments and projects.
ASP.NET Web Forms VB.NET SQL HTML/CSS Javascript/jQuery Python Powershell Git/Source Control Backend Frontend
Majors:
Computer Science, Philosophy
Minors:
Math, Art History
Titled "Denial of service lab for experiential cybersecurity learning in primarily undergraduate institutions", this paper is the result of a lab module created from denial-of-service work I performed with Xenia Mountrouidou and Xiangyang Li.
Cybersecurity is a broad, dynamic, and ever-changing field that is difficult to integrate into undergraduate Computer Science curricula. The absence of sanitized labs, coupled with the requirement of specialized faculty to teach the subject, poses obstacles many undergraduate colleges face in adopting cybersecurity education. In this paper we describe a pilot lab that is part of the education modules of the project CyberPaths. This project aims to offer a solution to cybersecurity education for primarily undergraduate liberal arts institutions. We present a pilot study that tested a denial of service learning lab in a general education course at Wofford College that has received very positive feedback from students.