248-905-1645

A little about myself

I'm a software engineer with over 25 years of experience, most of it centered on emergency management for multiple clients within the Department of Energy. I've worked primarily with the Microsoft stack and, more recently, with Python. For me, puzzle-solving is part of the appeal of building software solutions. I am hard-working, self-motivated, and passionate about software development, and I'm always looking to add new skills to my toolbelt.

Skills

Timeline

  • CyberGRX: May 2021 - December 2023
  • Miapeer: August 2015 - Present
  • Webroot: April 2018 - May 2021
  • AlphaTRAC: February 2006 - February 2018
  • Blue Lizard Software: February 1998 - August 2009
  • Apparatus: April 2001 - May 2001
  • Cavion: October 2000 - December 2000
  • EchoStar: July 1997 - October 2000

Experience

CyberGRX / ProcessUnity

Senior Software Engineer
May 2021 - December 2023
Denver, CO

Full-stack development of web platform

  • My day-to-day work involved debugging and building new features across the entire development stack, from Terraform to React.
  • I worked on a team primarily focused on rebuilding the processes needed to deprecate the existing graph database in favor of a relational database.
  • While the majority of the tasks focused on the back-end, my front-end experience positioned me as the primary developer responsible for virtually all front-end aspects of the work.

Enhanced company match

  • This task paralleled my Company Master work at Webroot and involved some of the same key personnel. Drawing on that foundational experience, I was able to redesign and improve upon the original work, taking it from an entirely SQL-based solution to one that utilized the full technology stack.
  • I incorporated techniques such as transliteration and tools such as ZoomInfo to further enhance the company-matching process.
  • This feature was foundational to key engineering goals: moving our data and systems to a relational database, automating the customer bulk ingest process, improving results from the company search page, and providing the initial integration component with ProcessUnity.

Bulk ingest

  • I took on the role of team technical lead to drive this feature, which heavily utilized my previous Company Master work.
  • This epic required me to spearhead the adoption of features and technologies new to our architecture, including AWS Lambda, AWS EventBridge, and ZoomInfo.

Merger integration

  • I was the sole developer from CyberGRX on the integration team working with ProcessUnity to unify the unique functionality of both platforms.

Miapeer LLC

Founder
August 2015 - Present
Westminster, CO

CAPARS (Computer Assisted Protective Action Recommendation System) upgrades and maintenance

  • I am currently working with Los Alamos National Laboratory, providing consultation and active development for their multiple installations of CAPARS.

Financial web application (Quantum)

  • I designed this app to empower individuals in managing their finances through intuitive tracking and budgeting features.
  • This project was born out of necessity when other financial applications lacked features I needed for my own money management. The original version was a desktop application written in VB6; the current version has a web application front-end with a Python REST API. I still use it for my personal finances.
  • This is hosted on Microsoft Azure utilizing continuous deployment via Azure DevOps and Azure Container Registry.

Work with clients to bring solutions to market

  • I was the consultant, architect, and engineer for the full development and deployment of Courtesy Service's web application.
  • I worked with Nussentials to prepare their upgraded WordPress eCommerce site for public release.

Webroot / Carbonite / OpenText

Senior Software Engineer
April 2018 - May 2021
Broomfield, CO

Research and development of Company Master initiative

  • I developed a new process for fuzzy name and address matching to improve duplicate company identification and prevention.
  • I evaluated and implemented various APIs and algorithms, such as the Google Places API and weighted scoring, along with a new architecture for deriving and searching alternate names.

Continued improvement of eCommerce systems

  • My primary role at this company was as a SQL developer. I extended functionality and debugged business logic within the database schema, including customer data, subscriptions, billing, orders, website routing, and more.
  • I received reports and performed production data support for customers and product licenses in order to keep the data clean and processes running smoothly.

Marketo person import process

  • I worked with the marketing department to create a more efficient person import process to assist in the transition from Eloqua to Marketo. This carried forward to allow the rapid import of marketing contact lists.
  • The updated process performed data validation and normalization of 500k records every month, which improved processing time from 45 days to 1 hour.

Front-end development

  • I improved the user search experience by incorporating Google's Geolocation service to allow for more casual address input.
  • I updated the email marketing opt-out form to use React. This work also helped prevent potential data mining by presenting the user with a generic message and performing the appropriate validation in the database.

AlphaTRAC

Senior Software Engineer
February 2006 - February 2018
Westminster, CO

Department of Energy EPHA (Emergency Planning Hazards Assessment) automation from client data to final report output

  • I was tasked with the maintenance and future development of the in-house software used to generate emergency management reports. I took this app through multiple iterations, from a developer-only tool to something that analysts could pilot themselves. The original version was written in VB6 and relied on a substantial amount of Excel/Word automation; my focus was to keep that application running while replacing the VB6 elements with C#. Years later, the project was reborn using NodeJS and Python (see CAPARS below).
  • I worked with analysts and scientists to customize the requested EPHA reports to fit client needs.
  • I pioneered and implemented an approximation algorithm for chemical Threshold Screening Quantity calculation to improve accuracy and reduce unnecessary analysis.
  • I increased the performance and precision of EPIcode and Hotspot automation tools, including the creation of a tool to combine multiple outputs into a single mixture.
  • I developed a tool to calculate the evaporation of puddles of hazardous materials using client inventory.

Lead developer of CAPARS (Computer Assisted Protective Action Recommendation System) emergency management response and reporting application

  • I maintained and updated the original version, which is still in production. Notably, I solved recurring client network security conflicts that impeded system communication by streamlining from a dual-server configuration to a single server with a hosted virtual machine, which also simplified server administration and deployment. I also enhanced the wind field applet to be user configurable, allowing for faster custom client deployments.
  • I created the next-generation version for EPHA generation using new dispersion/consequence assessment modeling methodology. It performed modeling using real meteorological data from the site, rather than theoretical wind conditions with simple straight-line models, to more accurately reflect dispersion through complex terrain. The application was designed for parallelized and distributed computing, with the intent to utilize the full processing power of a multi-core system and allow additional machines to "subscribe" in order to assist with the workload.
  • I recommended and custom-built real-time emergency response and consequence analysis servers to client specifications and budget. On these servers, I performed server administration, including OS upgrades, repairs, and security updates.

Interactive emergency response training software (AlphaACT)

  • This was originally developed as a desktop application for the analysis and production of Protective Action Plans. It later expanded into a web-based crisis decision training platform for first responders. I prepared the application for a demo at EMI SIG, after which I received special recognition for creating a Minimum Viable Product (MVP) system in an extremely short period of time.
  • I worked with an external company to evaluate the integration of their pattern recognition engine (PRE). This effort eventually turned towards developing a custom PRE in-house to drive the system to a definitive answer as quickly as possible. I also collaborated with external software teams to integrate the AlphaTRAC system into their custom user interfaces and Learning Management Systems (LMS).
  • It originally handled chemical and radiological hazard scenarios for the Department of Energy and grew to serve additional fields including the Marine Corps, fire service, rail operations, and law enforcement.
  • It was deployed to Azure but could optionally be installed on-premises to protect sensitive data.

Experience gathering platform (XCapture)

  • I developed this web application and REST API to convert SME after-action reports into training scenarios for AlphaACT; it was designed to be adaptable for various fields and implementations.
  • Additionally, I created an Android version of the application, enabling offline deployment to remote locations.
  • The core of the application was a question-and-answer style interview that guided SMEs through capturing their experience.
  • I worked in collaboration with external teams, such as the IAFC (International Association of Fire Chiefs), to integrate the XCapture API with their user interface.

Implementation, refinement, and upkeep of several software processes and operations

  • I set up and administered an on-premises installation of Jira for issue management and process workflow adherence.
  • I worked on a company IEEE-compliance committee for Software Quality Assurance. This included implementing processes and writing system documentation.
  • Government contracts came with time-tracking and administrative requirements, which off-the-shelf software couldn't meet. I created an Excel-based timecard solution to improve accuracy between accounting, project managers, and employees. This timecard system was used daily by every employee for over 10 years.
  • I managed the evolution of our source code and company documentation from CVS to Subversion, and later split them into more modern specialty tools, Git and SharePoint.

Windows Server, Linux, and Azure administration

  • I became the primary individual responsible for any IT issues that came up. I built, purchased, and maintained workstations and servers, upgraded the network infrastructure, and implemented security protocols.
  • When the company shifted to a fully remote workforce, I played a key role by migrating the business's systems, infrastructure, and operations to the cloud. This included moving web applications to Microsoft Azure, transferring company services to cloud-based counterparts such as QuickBooks, and implementing other collaborative tools such as Slack.

Blue Lizard Software LLC

Co-Founder
February 1998 - August 2009
Westminster, CO

Water World

  • This was my first professional project.
  • I created a scheduling application for Water World that reduced scheduling labor from 64 hours to 24 hours every two weeks.
  • Having underage employees posed the additional challenge of navigating labor-law restrictions. The application was designed to improve scheduling accuracy and efficiency for nearly 1,000 employees while avoiding the penalties that come with labor-law violations.

Rehabilitation Services Administration

  • I developed a database application designed to track the treatment plans for offender rehabilitation.

Quantum

  • During this period, I created the first version of Quantum (mentioned above under Miapeer LLC). It was my first attempt at working with fuzzy matching and predictive analysis.

Apparatus Sales Corporation

Tech Support I for the U.S. EPA
April 2001 - May 2001
Denver, CO

Tier-1 support

  • I provided software technical support along with user education, giving users the knowledge and assistance they needed to get the most out of their software.

Loaner-Equipment Tracking application

  • I rewrote and streamlined the "Loaner-Equipment Tracking" application after recognizing that the original was practically unusable and, as a result, saw minimal use. The objective was a significantly more efficient and user-friendly tool.
  • By replacing the existing paper-based process with an electronic one that included status reports, it increased equipment availability and saved time spent tracking down checked-out and overdue equipment.

Cavion Technologies

Tech Support I
October 2000 - December 2000
Englewood, CO

Tier-1 support

  • I provided basic customer support over the phone and escalated as necessary.

Created the uptime report

  • I conceptualized and developed an automated report that compiled system logs and offered customers a comprehensive monthly review of their services. This report provided insights into the uptime and operational status of their subscriptions.

EchoStar Communications

Reporting Analyst
July 1997 - October 2000
Englewood, CO

Streamlined equipment refund process

  • As a billing associate, I was shocked at how long it took to process refunds for returned customer equipment. I found that most of the time was spent waiting for inter-office paperwork delivery and signatures from select personnel. In my downtime, I started working on an application to transmit the paperwork electronically, track its progress, and notify the appropriate individuals when their sign-off was needed.
  • I was able to reduce the refund processing time from 4-6 weeks to 4-5 days.
  • Through this effort, I landed a role as a Reporting Analyst and was recognized as employee of the month.

Call center processes, metrics, and reporting

  • My focus in this role was on dynamic data import and export, report generation, and process automation.
  • I developed and maintained dozens of Access Database applications. These were used to perform business functions such as customer account maintenance, equipment refunds, and credit card batch reconciliation.
  • Because of the large number of databases I worked with daily, I was instrumental in helping IT with the early preparation of the company's data warehouse.

Management support

  • I provided both scheduled and ad hoc reporting support for all three call center management teams.

Interests

  • I always enjoy learning new development tools, techniques, and languages
  • Home improvement - It feels great to solve a problem with your own hands
  • I am learning to speak Tagalog
  • I enjoy all manner of games especially card games, fighting games, and RPGs
  • I'm told that I make a mean cup of coffee (even though I don't really enjoy coffee)
  • Apparently, I love commas