
Projects Completed at Macy's
-
Product Type & Google Product Category Overhaul - Successfully executed two initiatives to optimize the paid search program's data feed for campaign structure, data accuracy, and shopping ad targeting. For product type, we worked with Paid Search and Dentsu to structure the catalog's product types for campaign management: we went through each family of business, methodically organized the best campaign structure, and then built custom logic to transform product type values in the feed based on metadata from our Site Merchandising org. For Google Product Categories, we leveraged ML and AI to choose the best value from the Google Taxonomy list based on the metadata we had available, such as product title, description, brand, and detail information. More about the differences between Product Type and Google Product Category here. Once these were completed, we rolled out the changes across all our marketing channels. This resulted in greater efficiency and ease in campaign setup and management, and it addressed relevancy issues for items in search results, display, and targeting.
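As an illustrative sketch only (the field names, taxonomy entries, and scoring rule here are hypothetical stand-ins, not Macy's actual logic or the production ML model), the two transformations might look like:

```python
def build_product_type(meta: dict) -> str:
    """Compose a hierarchical product_type path from site-merch metadata."""
    parts = [meta.get("family_of_business"), meta.get("category"), meta.get("subcategory")]
    return " > ".join(p for p in parts if p)

# Tiny excerpt of the real Google taxonomy, for illustration.
GOOGLE_TAXONOMY = [
    "Apparel & Accessories > Clothing > Dresses",
    "Apparel & Accessories > Shoes",
    "Home & Garden > Kitchen & Dining > Cookware",
]

def pick_google_category(title: str, description: str, taxonomy=GOOGLE_TAXONOMY) -> str:
    """Toy stand-in for the ML classifier: score each taxonomy path by
    token overlap with the item's title and description."""
    text = set((title + " " + description).lower().split())

    def score(path: str) -> int:
        tokens = set(path.lower().replace(">", " ").replace("&", " ").split())
        return len(tokens & text)

    return max(taxonomy, key=score)
```

In practice the category assignment came from an ML/AI pipeline over richer metadata; the overlap score above just shows the shape of the decision.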
-
Dead Net Profitability - Tasked by leadership to enable our marketing teams to target spend toward items within certain profit brackets, my team worked with merchandising, supply chain, and finance teams to identify all the variables that needed to be taken into account to arrive at an item's dead net profitability. Once these were identified, we worked with our DevOps team to source and incorporate those variables into our data mart. We then defined the calculated field and deployed a scoring system delivered downstream to our marketing channels' feeds, so that the marketing teams could identify items in each zone and target campaigns accordingly. This expanded digital marketing's ability to allocate spend and optimize ROI.
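A minimal sketch of the calculated field and zone scoring, assuming hypothetical cost variables and zone thresholds (the actual model combined many more merchandising, supply-chain, and finance inputs):

```python
def dead_net_profit(price: float, product_cost: float, fulfillment_cost: float,
                    return_rate: float, markdown_alloc: float) -> float:
    """Profit after subtracting hypothetical cost components, including
    an expected-returns allowance derived from the return rate."""
    expected_returns = price * return_rate
    return price - product_cost - fulfillment_cost - markdown_alloc - expected_returns

def profit_zone(profit: float, price: float) -> str:
    """Bucket the margin into zones delivered downstream in the feeds.
    Thresholds here are invented for illustration."""
    margin = profit / price if price else 0.0
    if margin >= 0.30:
        return "A"
    if margin >= 0.10:
        return "B"
    if margin >= 0.0:
        return "C"
    return "D"
```

Marketing teams could then filter feed items by zone when building campaigns.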
-
Competitive Pricing Initiative - Worked with our pricing team, using McKinsey data sources, to derive a competitive price score for our catalog that used daily data to determine whether an item's price was competitive against the market. Additionally, we began reporting competitive price insights from Google Merchant Center to our paid media teams for real-time adjustments in spend.
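One simple way to express such a score (a hedged sketch; the actual scoring methodology from the pricing team and McKinsey data is not reproduced here) is the share of observed competitor prices that our price beats or matches:

```python
def competitive_price_score(our_price: float, market_prices: list) -> float:
    """Toy score: fraction of competitor prices our price beats or ties."""
    if not market_prices:
        return None  # no market observations for this item today
    beaten = sum(1 for p in market_prices if our_price <= p)
    return beaten / len(market_prices)

def is_competitive(score, threshold: float = 0.5) -> bool:
    """Hypothetical cutoff for flagging an item as competitively priced."""
    return score is not None and score >= threshold
```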
-
uCVR A/B Testing - Tasked by leadership to increase the efficiency of media spend, we collaborated with our Adobe Analytics team to derive items' unit conversion rate over 7-, 14-, and 30-day windows by comparing page views from each marketing tactic's traffic to the number of units purchased by the same traffic source. We then deployed this as a data signal into paid media campaigns over a 30-day experiment to compare conversion performance between campaigns that filtered out items below a certain threshold and campaigns that did not.
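The metric itself is simple; a sketch under the assumption that page views and units are already attributed per traffic source and window (the input shape and threshold below are hypothetical):

```python
def unit_conversion_rate(page_views: int, units_sold: int) -> float:
    """uCVR: units purchased divided by page views from the same traffic source."""
    return units_sold / page_views if page_views else 0.0

def ucvr_by_window(events: dict) -> dict:
    """events: {window_days: {"page_views": n, "units": m}} for one item/source."""
    return {w: unit_conversion_rate(d["page_views"], d["units"]) for w, d in events.items()}

def passes_threshold(ucvr: float, threshold: float) -> bool:
    """The test-arm campaign filter: keep only items at or above the threshold."""
    return ucvr >= threshold
```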
-
Category Top Performers - In an effort to curate top-performing campaigns, we sourced items' unit sales over 7-, 14-, and 30-day windows and created a field that compared each item's units sold against the rest of the items in its category, identifying the top performers in each category of the catalog. This was then used as a data signal that allowed our marketing teams to create campaigns targeting just the top-performing items.
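A minimal sketch of the within-category ranking, assuming a hypothetical "top share" cutoff (the production field compared units sold over each sales window; the 10% default here is invented):

```python
from collections import defaultdict

def top_performers(items: list, top_share: float = 0.1) -> set:
    """items: list of (item_id, category, units_sold).
    Flag the top `top_share` of each category by units, at least one per category."""
    by_cat = defaultdict(list)
    for item_id, cat, units in items:
        by_cat[cat].append((item_id, units))
    flagged = set()
    for cat, rows in by_cat.items():
        rows.sort(key=lambda r: r[1], reverse=True)  # most units first
        keep = max(1, int(len(rows) * top_share))
        flagged.update(item_id for item_id, _ in rows[:keep])
    return flagged
```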
-
Google Merchant Center Suspension Warnings - My team managed roughly one major suspension warning each quarter. These were fairly routine and would spike around heavy shopping periods, when Google's Trust & Safety teams ramp up their audits. Essentially, if the item catalog violates any disapproval reason above a certain percentage threshold, the merchant's account (Macy's or Bloomingdale's in my case) is automatically hit with a 30-day suspension warning along with the data quality issues that must be resolved. These reasons range from mismatched pricing between the data catalog and the landing page on the site, to mismatches between the quoted shipping and delivery times and the landing page, item availability, wrong tax calculations, etc. The remedy for each suspension warning varied, but it was my team's responsibility to understand the root cause of the issue and then project-manage the solution scope and delivery before any impact to the paid media team's operations. This could range from subtle edits within our immediate ability, to escalated changes on the site's landing pages, to sourcing additional or corrected data from disparate enterprise teams. More about these warnings here. Editorial & professional requirements for Google here.
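To illustrate the price-mismatch case (a sketch only: the tolerance and the 1% threshold are invented placeholders, not Google's actual enforcement rules), a root-cause audit might start by diffing feed prices against crawled landing-page prices:

```python
def find_price_mismatches(feed: dict, site: dict, tolerance: float = 0.01) -> dict:
    """feed/site: {item_id: price}. Return items whose feed price
    drifts from the landing-page price by more than the tolerance."""
    return {
        item_id: (feed_price, site[item_id])
        for item_id, feed_price in feed.items()
        if item_id in site and abs(feed_price - site[item_id]) > tolerance
    }

def over_warning_threshold(mismatches: dict, catalog_size: int,
                           threshold_pct: float = 1.0) -> bool:
    """Hypothetical trigger: flag when the mismatch share of the catalog
    exceeds a set percentage."""
    return (len(mismatches) / catalog_size) * 100 > threshold_pct
```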
-
Google Promotion Support - Similar to the GMC suspension warnings, we would often have site merchants reach out asking why promotions they submitted were disapproved by Google. Most often the reason was editorial requirements being overlooked by the team, but in other, more severe cases there were issues with how the promotion was set up in our PIM system, the items associated with the promotion, the pricing for discounts, BOGO, etc. So, my team would routinely educate merchants and resolve those issues. Promotion editorial requirements for Google here.
-
Master Page Traffic Campaigns - At the time of this writing, both Macy's and Bloomingdale's have normal landing pages and master pages. Normal landing pages are dedicated to a single item for checkout. Master pages showcase a collection of themed or related items and require the customer to make an additional click in their journey on an individual item, which then loads that item's page for checkout. When these page types were rolled out, they were not intended for use in marketing; they were solely intended for organic browsing once the customer was already on our site. However, the analytics team saw that these master pages drew a considerable amount of traffic and conversion and pushed to have them used in marketing channels wherever possible. So, our team began sourcing all the attributes needed for these pages and consulted each channel to understand how the values should be portrayed in their files (e.g., if a master page isn't for a dedicated item, how would you state the price for that page? This was usually stated as a min-max price range). This was a year-long project.
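The min-max price question above can be sketched as a small aggregation (illustrative only; each channel's file format dictated the actual string representation):

```python
def master_page_price(item_prices: list):
    """Express a master page's price as a min-max range over its items;
    collapse to a single price when all items cost the same."""
    if not item_prices:
        return None  # page has no purchasable items yet
    lo, hi = min(item_prices), max(item_prices)
    return f"{lo:.2f}" if lo == hi else f"{lo:.2f}-{hi:.2f}"
```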
-
Data Dictionary - My team owned a very large data mart with 200+ attributes. When I originally joined, this data mart had no metadata about it except for disparate dev tickets in JIRA and some scarce documentation in Confluence. This was a big vulnerability for my team's position, since we were the first to respond to any questions or escalations from marketing about data quality issues, definitions, or deep dives for strategy. To resolve this, we kicked off a two-year project to groom the entire data mart and establish each field's source system, its transformation logic throughout the entire data flow, any business rules/logic, and its intended use case. This was a very heavy, long project that ultimately required us to add resources to our team to support it.
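The per-field record we groomed can be sketched as a simple schema (the class and field names below are hypothetical; the actual dictionary lived in documentation, not code):

```python
from dataclasses import dataclass, field

@dataclass
class DictionaryEntry:
    """One groomed attribute: what we documented for each of the 200+ columns."""
    name: str
    source_system: str            # upstream system of record
    transformation_logic: str     # how the value changes through the data flow
    business_rules: list = field(default_factory=list)
    intended_use: str = ""

def coverage(entries: list, all_fields: list) -> float:
    """Share of data-mart fields that have a completed dictionary entry."""
    documented = {e.name for e in entries}
    return len(documented & set(all_fields)) / len(all_fields)
```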
-
OnPrem to Cloud Migration - The organization underwent a large transformation, shifting from on-prem servers to Google Cloud Platform. This spanned several months for my team, during which we had to advise on the details of every data transfer file and do thorough quality checks comparing the current production versions of files to the UAT versions of the migrated files. What prevented a simple lift and shift was that the on-prem model used a home-grown UI tool that required the end user to script in Groovy, while the new files in the cloud would use explicit SQL. The translation between the two was different enough to require a lot of time and attention throughout the process.
-
Google Cloud to ProductsUP Migration - At the outset of this project, we were operating under a model whereby my team would gather all the requirements from 3rd parties for what data they needed and the custom format/logic for it. We would then script and test the file and hand the final product to our data engineering teams to schedule, run, and store it. We would also coordinate the file transfers to the end destination with our Managed File Transfer team. This process of working with multiple teams was an obvious opportunity to find efficiencies. So, we submitted RFPs to a number of solution platforms and moved forward with a partnership with ProductsUP. After the contracts were signed, we undertook a 3-month project of translating all in-scope files into transformation logic in the ProductsUP platform and conducting thorough UAT. Finally, we coordinated with the internal teams on cutoff and transition dates to avoid any interruption in marketing's operations. This project ultimately reduced the company's dependencies and the time required to onboard new partners and to diagnose/resolve production issues.
-
Affiliate Migration - Rakuten to Commission Junction - In the 2nd half of the year, with only 3 months before code freeze and the peak holiday sales season, we were tasked with transitioning all the product catalog, returns, shipping, and sales files from Rakuten to Commission Junction. This was a large undertaking given the tight timeline and the heavy need for UAT. It was a cross-functional achievement spanning my team, our data engineering teams, and the Commission Junction team. Additionally, we were tasked with implementing their tracking pixel for better visibility and alignment on transaction data.
-
SEO Migration - Yext to RIO SEO - Another migration in the same half of the year as our affiliate migration: we had to scope and execute a full lift and shift of our data files and local storefront pages and data from Yext to RIO SEO. This was conducted for both Macy's and Bloomingdale's and required a high level of coordination and management across the enterprise and RIO SEO teams.
-
Cookieless Future Audit - In 2024, prior to Google's announcement that they would further delay the deprecation of the 3rd-party cookie in Chrome, I was tasked with compiling a full inventory of 3rd-party partners that could potentially be impacted by the deprecation and soliciting feedback from each as to: A) what data they were currently collecting/using, B) how they were using this data, C) what contingency plans or solutions they had planned, and D) what the impact to their operations would be once the cookie was deprecated. For each partnership, I documented, tracked, and hosted discussions and solution sessions between their teams and Macy's teams. This was a great project that exposed me to many new partners and tactics with which I was not previously familiar. Ultimately, we tracked and managed 23 distinct partners that would be impacted and successfully planned and executed alternative solutions to resolve or minimize data loss.
-
Data File Frequency ROI - Tasked by leadership to explore improving data quality decay, we investigated increasing the frequency of catalog data file transfers, to address items going out of stock intraday on site while still being advertised on marketing channels, as well as the desire to begin intraday price changes. Our first effort was to assess all our marketing partners and their ability to receive catalog data via APIs. After this proved a resounding no-go, we explored delivering data files more frequently. This was a lengthy discovery process and ultimately proved cost-prohibitive. But it was a great exercise nonetheless, because we discovered that we could set up an API call to Google to send real-time updates on items' availability as they went out of stock on site and minimize the wasted ad spend on those items!
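A hedged sketch of that real-time availability push: the payload builder below uses Shopping-feed-style attribute values, but the merchant ID is a placeholder and the commented request is illustrative only (the production call goes through Google's Content API with OAuth credentials, which are out of scope here):

```python
MERCHANT_ID = "1234567"  # placeholder, not a real account

def availability_payload(offer_id: str, in_stock: bool) -> dict:
    """Build a minimal availability update for one item; values mirror
    the Shopping feed's availability attribute."""
    return {
        "offerId": offer_id,
        "availability": "in stock" if in_stock else "out of stock",
    }

# Illustrative send (not executed here):
# requests.post(
#     f"https://shoppingcontent.googleapis.com/content/v2.1/{MERCHANT_ID}/products",
#     headers={"Authorization": "Bearer <token>"},
#     json=availability_payload("sku-123", False),
# )
```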
-
Meta CAPI & AMAPI - Successfully implemented the Conversions API (CAPI) and Advanced Measurement API (AMAPI) for Meta on our site after receiving a request from senior leadership. These two projects required us to engage several different internal teams to bring them into the conversation, sell the value add, gain consensus and sponsorship for resources, and implement and conduct UAT. Teams involved ranged from Legal, Risk and Security, Engineering, and Tag Management to Meta's partnership team. From these implementations, we were able to get better conversion and measurement data for our social teams.
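For flavor, a minimal sketch of one server-side CAPI purchase event (a simplified illustration, not our production implementation; Meta requires user identifiers to be normalized and SHA-256 hashed before sending, and events are POSTed in a `data` array to the pixel's `/events` Graph API endpoint):

```python
import hashlib
import time

def hash_pii(value: str) -> str:
    """Normalize (trim, lowercase) and SHA-256 hash a user identifier."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

def purchase_event(email: str, value: float, currency: str = "USD",
                   order_id: str = "") -> dict:
    """One event record for the Conversions API `data` array."""
    return {
        "event_name": "Purchase",
        "event_time": int(time.time()),
        "action_source": "website",
        "user_data": {"em": [hash_pii(email)]},       # hashed email
        "custom_data": {"value": value, "currency": currency, "order_id": order_id},
    }
```

Deduplication against the browser pixel (via event IDs) and the AMAPI integration are omitted for brevity.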
-
ERP Implementation
-
Enterprise Accounting & Consolidation - Financial consolidation and reporting across the business units.
-
Financial Management - Complete view of the enterprise to stay compliant with customers, suppliers, and regulations. Allows for AR/AP/GL accounting, reporting, consolidation, and cash flow management.
-
Human Capital Management - Tracking for employee expenses, time and attendance, labor, skills and training, incidents, and other records.
-
Suppliers & Purchasing Management - Purchasing functions include RFQ, inbound shipping, purchase orders, requisitions, and supplier pricing.
-
Supply Chain Planning and Collaboration - Visibility with integrated planning for supply and demand planning, distribution requirements planning, and master scheduling.
-
Production Finite Scheduling - Scheduling engine to determine which jobs to schedule to which work centers, with resource availability in mind.
-
Closed-Loop Quality Management - Maintain quality procedures directly from the control plan for increased process repeatability and predictability.
-
Inventory Management - Track and manage inventory in real time, stay in control with end-to-end traceability.
-
Production Management - Plant floor operations
-
Document Control & Management - Quality requirements, documentation, and record archiving
-
Audit Management - Digital quality management
-