My first boss called me. “I need you to report on expedites by letter code,” he said, “For all facilities. Can you do that?”
“Yeah, I should be able to do that. Our facility will be the hardest, of course, but you just add a query in there to reconcile the codes… yeah, I can get that pretty easily.” For some perverse reason I cannot explain, our facility did not use the letter code field to assign letter codes, but an extra text field. The letter codes represent how important it is to keep the product in stock. It is a simple matter to use a query to select between two different fields depending on the value of another field, although I think I am still the only one on our entire team who has gotten a handle on this trick. (LetterCode: IIf([Facility]="Us",[CustomCode],[NormalCode]) — there you go.)
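For anyone outside the Access world, the same trick can be sketched in standard SQL with a CASE expression. Here is a minimal illustration using SQLite through Python; the table and column names are invented for the example, not taken from the real report:

```python
import sqlite3

# Sketch of the Access IIf() trick in standard SQL: pick the letter code
# from a different column depending on which facility the row belongs to.
# Table and column names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Items (Facility TEXT, NormalCode TEXT, CustomCode TEXT)")
conn.executemany(
    "INSERT INTO Items VALUES (?, ?, ?)",
    [("Us", "", "A"), ("Them", "B", ""), ("Them", "C", "")],
)

# Access: LetterCode: IIf([Facility]="Us",[CustomCode],[NormalCode])
# Standard SQL equivalent:
rows = conn.execute(
    """
    SELECT Facility,
           CASE WHEN Facility = 'Us' THEN CustomCode ELSE NormalCode END
               AS LetterCode
    FROM Items
    """
).fetchall()
print(rows)  # [('Us', 'A'), ('Them', 'B'), ('Them', 'C')]
```

The point is the same either way: the query paper over the nonstandard field at read time, so nobody has to touch the base data.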
It was such an easy addition that while I was at it I added fields that would help anyone digging deep into the report cross-reference the facts. Then I reorganized it. Then I cleaned it up some and sent it to my boss. A little overtime, but not too bad, and a nice straightforward report. I told him if he liked it I would send it to everyone else.
The next day he called me. “When are you going to get that report done?” It would not be the first time he did not see something I sent him. “Uhh…. I am pretty sure I sent it to you, let me check…”
“I saw the one for this month. What about the last six months?”
“Oh sorry I did not make a note that it was supposed to be for six months, I can get that.”
“Okay, good, do that.”
“So, you are happy with how it looks, then?”
“Well, let me look at it…” I tried to deduce how he had seen the one month version without looking at it but I failed. “What is this column with the symbol on it?” he asked.
“That is for item numbers that don’t have a letter code,” I said.
“There’s a lot of them. Can you get them into buckets with the letter codes?”
“I’m not sure I can. I’m just pulling out the data that is there. I don’t know where else to get the data from.”
“Yeah but can you just break them up into groups by the numbers?”
“You mean by volume?” Letter codes are generally calculated from the volume of business done on the item. “I guess I could probably do that. Yeah, I can just get the volume from the shipments history. Letter code is by site, though, that could take a little while to run. Not that I can’t do it, it will just take some time.”
“How about if you just took the numbers here and just split them up proportionately?”
“You mean the volume of expedites? Wait, clarify this for me. Are we trying to get the number of expedites that were filed for each class of item, or are we trying to classify the items by how frequently they get expedited?”
“No, we just want to use the letter codes that are already there, but you should be able to correlate from the letter code and just break out the number of unknowns by percent.”
“Okay, I’m not sure that would work. Because we could get a lot of expedites on the highest volume items just because people order them a lot so there’s more traffic, or we could get a lot of expedites on the lowest volume items because it takes us longer to ship them so people get more impatient. So I don’t know we can say there’s a correlation between the shipment volume and the expedite volume.”
“So what do we need to do?”
“Well, if you want to know by the expedite volume I can ignore the letter code and just bucket the part numbers by number of expedites. You just tell me what buckets you want them in and I can do that. Or if you want it by volume of shipments I will have to calculate the yearly values and then work from there.”
“What if we just take the number that is already there? I think it would be pretty close. Just work them into the other buckets.”
“No, that’s two completely different things. Either we can go by volume of shipment or by volume of expedites. Tell me what you want to know, what you are going to use it for, and I can figure out how to do it.”
“Well… can we do both?”
“I just don’t want you to have to spend a lot of time on this. I don’t want it to be taking all of your time. So we just need to figure out what we can do.”
“Let’s start by clarifying what you want and I can figure out my options from there. Do you want them bucketed by letter code or by expedite volume?”
“Which is easier?”
At this point I about choked. Would you want to know the capital of France or the capital of Belize, whichever was easier? They are two different pieces of information. You want to know one or you want to know the other. Maybe even you decide that you need to know both. But if you just run with whichever is easier–what useful conclusions are you going to draw once I supply the information?
“I am pretty sure I can do it either way, I just need to know what you want so I can figure out how to do it.”
“That’s fine, we can do it whichever way, I just don’t want it to take all of your time and be a huge project. And I need it to be repeatable. Wouldn’t it just be easier to break these numbers up into buckets?”
“I would rather not be messing around with numbers on the spreadsheet, especially if you want it to be repeatable. I hate the other report that I have to work out on a spreadsheet.”
“Yeah, I’m not sure all of the time spent on that other report is value-added.” Never mind that on the other report I at least have a very clear idea of what data I am deriving, even if it takes me longer than I’d like.
“So what do you want to do for this report?”
“Can you just take these ones with no letter code and add them into the other buckets?”
“I can do that. I am just… not comfortable with the accuracy of the data that would result.” Meaning I will do that if you order me to, but you are going to have to be explicit.
“I think it will be pretty good. Good enough for what we need.”
“Okay. Tell me what buckets you want. What percent for each bucket, I mean.”
“Well, let’s see, if you take this number… that’s about… okay and add that… hmm….”
“How about this? Want me to come up with a solution and then check it with you?”
“Okay that’s fine.”
I decided I would just do the work to dig up the letter codes for all facilities. I started refreshing my tables to make sure I didn’t have blanks from missing data. That didn’t go quickly, and I didn’t have a connection to all facilities. So I investigated how to connect to one of the other facilities. Meanwhile I was letting the biggest dataset, for the distribution warehouse, refresh. I messed up the new connection a couple of times but finally got it running. However the warehouse still hadn’t refreshed.
I let it go for hours, thinking sometime it would finally pop up done. It didn’t. Then I remembered that at headquarters they got a nightly export of all the items lists. I asked for that to be mailed to me, but even as a zipped plain text file it was too big to pass the size limits on our e-mail servers. I asked my contact to trim it down to just the items I needed, but the rather vague assent made me fairly certain I would not see the file.
A little later I realized I could just connect remotely to the headquarters computer and pass myself the file. This network “live” connection (rather than by e-mail relay) is always much slower than e-mailing, but I figured it wouldn’t absolutely refuse to transfer the file. And it didn’t. It just asked for 90 minutes to do it, which was more time than I had left in the day.
So I also opened a help-desk ticket to ask that the file be sent by FTP to my local servers as well as the headquarters servers on the nightly batch.
The next morning I had the file. I thought I would just be able to drop it into my existing table but it turned out to be excessively troublesome, and it was a while before I got it loaded. Also my slave system that I was going to use to do the analysis had borked and the other work it was supposed to have done was undone. And actually it had failed for several days. So I had to go and patch that all up. In the meantime I talked to my boss and told him I would work on the report as soon as I got the problems on the system sorted out.
At last I got everything in its place and I ran my report. But now my own facility came up completely blank. I had no clue why, so I ran everything again. Still blank. Then I remembered that we had our own special field and I had just loaded in the standard table. So I copied all the data from our custom field into the standard field, even though I don’t like playing with my base data (you never know when a built-in assumption will become obsolete). I ran everything again. Still blank. Then I realized that I had copied the blank (because never even pulled) custom field from the standard data into the letter code field. So I pulled down a new copy of my site’s data and then pulled that into the data set for all sites (nearly blowing the two gig limit on Access databases in the meantime, so I compacted all the databases to clean up my sloppy work). I ran everything again. Still blank. Then I realized that I was using a query that automatically looked at the custom field for my facility, which was precisely what was missing from the standard data.
So I took out that one little query, which was the only thing I’d done so far that I liked, and ran it again, and it worked. Only, even with all the data showing I still had too many blanks. So it was right back to square one for all practical purposes.
I ran a simple query to get the total last-year volume by item by facility, which actually did not take very long after all. Especially compared to some of the other things I had been doing (or attempting) lately. Then I ran off that into another little table for the total volume for each facility so I could get the percentage of each item out of the total, which I did next.
Then I realized that was dumb, because no single item was going to be 80% of the total volume. I needed cumulative or running totals, from the highest movers to the lowest. But I knew from past experience that trying to run a calculation against a sorted dataset was probably going to be a failure in Access, because the sorting is not applied until all the data is calculated. You get results but they aren’t what you thought you asked for.
Then I tried to hack together a nested SQL query that would cleverly force the data to sort, then accumulate the total record by record. I couldn’t quite work it out, though.
Then I realized I was a moron because I was only trying to order by a number, and all I had to do was tell the query to add up the total quantity for all items where the quantity was equal or greater than the current quantity. And I knew how to do that. (DSum("Qty","Items","Facility='" & ThisFacility & "' And Qty>=" & ThisQty) — for sure nobody else on the team of data analyzers at this large multinational company knows that one.)
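The same running-total-without-sorting trick translates to standard SQL as a correlated subquery: for each item, sum the quantities of every item in the same facility whose quantity is at least as large. A minimal sketch in SQLite via Python, with an invented schema and distinct quantities (ties would need a tie-breaker column to avoid double-counting):

```python
import sqlite3

# Sketch of the DSum() running-total trick as a correlated subquery.
# For each item, the subquery sums Qty over all items in the same
# facility with Qty >= this row's Qty, giving a cumulative total from
# the highest mover down. Schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Items (Facility TEXT, Item TEXT, Qty INTEGER)")
conn.executemany(
    "INSERT INTO Items VALUES (?, ?, ?)",
    [("F1", "a", 50), ("F1", "b", 30), ("F1", "c", 20)],
)

rows = conn.execute(
    """
    SELECT i.Item, i.Qty,
           (SELECT SUM(j.Qty) FROM Items j
            WHERE j.Facility = i.Facility AND j.Qty >= i.Qty)
               AS RunningTotal
    FROM Items i
    ORDER BY i.Qty DESC
    """
).fetchall()
print(rows)  # [('a', 50, 50), ('b', 30, 80), ('c', 20, 100)]
```

Because each row computes its own total independently, the result is correct no matter what order the outer query returns the rows in, which is exactly what sidesteps the sort-order problem.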
Then I started asking people what percentage I should actually break the buckets at. Not my first boss; no. I asked some people in the facility. They weren’t sure, so I asked the expert from the distribution warehouse. He gave me the percentages — of dollar value of shipments. I knew for sure that all our letter codes were calculated primarily on unit shipments.
Then I called my second boss and asked him if he really wanted a funky amalgamation of everybody's different definition of a letter code plus whatever I made up for the rest. He said the expert's definition was wrong and the proper definition was primarily by units, but also accounted for number of shipments (not just number of pieces) as well as the value. And he said I should not worry about any of that but just get an approximate calculation by units shipped, and if people wanted a more sophisticated number they should fix their own letter codes. It took us about five minutes to reach an understanding, not half an hour to reach a truce.