#231–232
Author: ExcelBI
All files (the xlsx with the puzzle and the R script with the solution) for every puzzle are available on my GitHub. Enjoy.
Both puzzles this weekend are related to logistics. In the first one we need to complete the invoice data for each person placing an order. It is a classic transform-and-join problem; the one tricky part is adding totals for the rows and columns. Let's do it.
```r
library(tidyverse)
library(readxl)
library(janitor)

path = "Power Query/PQ_Challenge_231.xlsx"
input1 = read_excel(path, range = "A2:C5")
input2 = read_excel(path, range = "A8:B15")
test = read_excel(path, range = "E2:J6")
```
```r
input = input1 %>%
  separate_rows(c(Items, Quantity), sep = ", ") %>%
  left_join(input2, by = "Items") %>%
  mutate(Amount = as.numeric(Quantity) * Price) %>%
  select(-c(Price, Quantity)) %>%
  pivot_wider(names_from = "Items", values_from = "Amount",
              values_fn = list(Amount = sum), values_fill = 0) %>%
  select(Name = Person, u, x, y, z) %>%
  arrange(Name) %>%
  adorn_totals(c("row", "col"))
```
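The totals step is worth isolating. A minimal sketch of what `janitor::adorn_totals()` does, on made-up toy data (the names `Ann`, `Bob`, `x`, `y` are illustrative, not from the puzzle):

```r
library(dplyr)
library(janitor)

# Toy wide table: one row per person, one numeric column per item.
toy <- tibble::tibble(
  Name = c("Ann", "Bob"),
  x = c(1, 2),
  y = c(3, 4)
)

# adorn_totals(c("row", "col")) appends a "Total" row summing each numeric
# column, and a "Total" column summing across each row.
with_totals <- toy %>% adorn_totals(c("row", "col"))
```

Doing this by hand would mean a `summarise()` plus a `bind_rows()` for the row total and a `rowSums()` mutate for the column total; `adorn_totals()` collapses both into one call.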
```r
all.equal(input, test, check.attributes = FALSE)
#> [1] TRUE
```
This time we have stock levels for each store, but not every day is recorded. So we need to build a complete date range and a cumulative quantity for each store. It is not hard, and we can again do it with a single pipe. Let's see how.
```r
library(tidyverse)
library(readxl)

path = "Power Query/PQ_Challenge_232.xlsx"
input = read_excel(path, range = "A1:C7")
test = read_excel(path, range = "E1:G13")
```
```r
result = input %>%
  group_by(Store) %>%
  complete(Date = seq(min(Date), max(Date), by = "day")) %>%
  ungroup() %>%
  mutate(has_val = cumsum(!is.na(Quantity))) %>%
  fill(Quantity) %>%
  mutate(Quantity = cumsum(Quantity), .by = c(Store, has_val)) %>%
  select(-has_val)
```
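The heart of the pipe is the `complete()` + `fill()` pair. A small sketch on toy data (store, dates, and values are made up, not taken from the puzzle file) shows how missing days get inserted and then filled downward:

```r
library(dplyr)
library(tidyr)

# Toy data: one store with a two-day gap between recorded dates.
toy <- tibble(
  Store = "A",
  Date = as.Date(c("2024-01-01", "2024-01-04")),
  Quantity = c(10, 5)
)

filled <- toy %>%
  group_by(Store) %>%
  # Expand each store to a full daily sequence; new rows get NA Quantity.
  complete(Date = seq(min(Date), max(Date), by = "day")) %>%
  ungroup() %>%
  # Carry the last observed Quantity down into the NA gaps.
  fill(Quantity)
```

Here `complete()` inserts rows for 2024-01-02 and 2024-01-03 with `NA` in `Quantity`, and `fill()` replaces those `NA`s with the last recorded value, 10. The `has_val` counter in the real solution then marks which rows were originally recorded, so the cumulative sum can be grouped accordingly.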
```r
all.equal(result, test, check.attributes = FALSE)
#> [1] TRUE
```
Feel free to comment, share, and contact me with advice, questions, and your ideas on how to improve anything. You can also reach me on LinkedIn.
PowerQuery Puzzle solved with R was originally published in Numbers around us on Medium.