Compare commits: 37ee3e99da...master (83 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 572bffc845 | |
| | b879b6541a | |
| | a9caa4c8f1 | |
| | dce2d06602 | |
| | 9177d35caf | |
| | 665987395f | |
| | 5b03b8f156 | |
| | bea82cb548 | |
| | d749979aae | |
| | b6a56554af | |
| | 2f6120fbd8 | |
| | 0c31596018 | |
| | 28cf35e0e4 | |
| | 7ebb6dee2c | |
| | 44de1377ca | |
| | c3acf211c3 | |
| | 29528f23ac | |
| | 8be2fa3844 | |
| | 45d03e5ab3 | |
| | 2b3c7f4ca6 | |
| | 3ede691333 | |
| | 7b34b52e5e | |
| | 99c2269df8 | |
| | b10a6250b1 | |
| | 4cb35dca33 | |
| | 840773fd16 | |
| | 9a22efd4b3 | |
| | f5ea9e725e | |
| | 6a7378c265 | |
| | 5b0f1ab750 | |
| | 53930e66ac | |
| | 727099dbd9 | |
| | 98206fe6d4 | |
| | 57fdfb01cb | |
| | 1e32ec0988 | |
| | 6c061375ce | |
| | f538945dff | |
| | 00e60657fa | |
| | 1d7a0ef7b8 | |
| | 4974127cef | |
| | 9dbc2ca205 | |
| | e771ac9d9b | |
| | 52beb4196f | |
| | 5f62ea45f7 | |
| | 6efd55ae6a | |
| | 568fdd9a70 | |
| | 6dabe8bc66 | |
| | 86c9ad7653 | |
| | 4fc5caf228 | |
| | b831e92e1f | |
| | d799b122ce | |
| | 1d027d57fc | |
| | 49fc57029f | |
| | 955bdc78c1 | |
| | a7e06e7a6e | |
| | abca885f20 | |
| | 81b8ddc8b0 | |
| | 41e32d405b | |
| | 08c20ea6e0 | |
| | ed4abd2d7e | |
| | ee9c2c1ca0 | |
| | 8436426d3a | |
| | cb973d60cc | |
| | d09a8af5db | |
| | 6e623b617e | |
| | bd45c334b0 | |
| | 192ff3878e | |
| | b5cb827be2 | |
| | 149b753d22 | |
| | 709f4c0532 | |
| | fde1415f34 | |
| | 3128ab673f | |
| | 85dd9a31e3 | |
| | d4c162db33 | |
| | adcb2417ff | |
| | 4d346a23db | |
| | 6398a4d468 | |
| | 42b587918e | |
| | c187a03076 | |
| | cdf5a38512 | |
| | 3a43f90431 | |
| | 7032666476 | |
| | 65d6c13016 | |
.gitignore (vendored, 2 lines changed)
@@ -1,2 +1,4 @@
/.direnv/
/.go
input
*.png

day1/input (1000 lines changed)
File diff suppressed because it is too large.
day10/input (140 lines deleted)
@@ -1,140 +0,0 @@
(140 lines of pipe-maze puzzle-input grid)
day11/dayEleven.go (new file, 109 lines)
@@ -0,0 +1,109 @@
package day11

import (
	"fmt"
	"log"
	"os"
	"strings"
)

func Run() int64 {
	fmt.Println("hello day 11")

	filename := "day11/input"
	bytes, err := os.ReadFile(filename)
	if err != nil {
		panic(fmt.Sprint("cannot read file ", filename))
	}

	text := strings.TrimSpace(string(bytes))
	starCoords := make([]StarCoord, 0)

	lines := strings.Split(text, "\n")

	nonEmptyRows := make(map[int]any)
	nonEmptyCols := make(map[int]any)

	for rowNum, line := range lines {
		for colNum, symb := range line {
			if symb == '#' {
				starCoords = append(starCoords, StarCoord{rowNum, colNum})
				nonEmptyCols[colNum] = struct{}{}
				nonEmptyRows[rowNum] = struct{}{}
			}
		}
	}

	emptyRowsAbove := make([]int, len(lines))
	emptyColsToTheLeft := make([]int, len(lines[0]))

	for rowNum := range lines {
		if rowNum > 0 {
			emptyRowsAbove[rowNum] = emptyRowsAbove[rowNum-1]
		}
		_, isRowNonempty := nonEmptyRows[rowNum]
		if !isRowNonempty {
			emptyRowsAbove[rowNum] += 1
		}
	}

	for colNum := range lines[0] {
		if colNum > 0 {
			emptyColsToTheLeft[colNum] = emptyColsToTheLeft[colNum-1]
		}
		_, isColNonempty := nonEmptyCols[colNum]
		if !isColNonempty {
			emptyColsToTheLeft[colNum] += 1
		}
	}

	var distanceSum int64
	for i := 0; i < len(starCoords); i++ {
		for j := i + 1; j < len(starCoords); j++ {
			// calc distance between stars i and j
			starA := starCoords[i]
			starB := starCoords[j]

			maxRow := starA.Row
			minRow := starB.Row
			if maxRow < minRow {
				maxRow, minRow = minRow, maxRow
			}

			var multiplier int64
			multiplier = 1000000 - 1

			emptyRowsBetween := int64(emptyRowsAbove[maxRow]) - int64(emptyRowsAbove[minRow])

			rowDistance := int64(maxRow) - int64(minRow) + emptyRowsBetween*multiplier

			maxCol := starA.Col
			minCol := starB.Col
			if maxCol < minCol {
				maxCol, minCol = minCol, maxCol
			}

			emptyColsBetween := int64(emptyColsToTheLeft[maxCol]) - int64(emptyColsToTheLeft[minCol])

			colDistance := int64(maxCol) - int64(minCol) + emptyColsBetween*multiplier

			distance := rowDistance + colDistance
			log.Printf("between stars %d %+v and %d %+v distance is %d. emptyColsBetween %d ; emptyRowsBetween %d\n", i, starA, j, starB, distance, emptyColsBetween, emptyRowsBetween)
			distanceSum += distance
		}
	}

	// oh, i have list of all stars, i can just iterate over them and
	// only keep rowNums for which there are stars. yeah

	fmt.Println(starCoords)
	fmt.Println(emptyRowsAbove)
	fmt.Println(emptyColsToTheLeft)

	return distanceSum
}

type StarCoord struct {
	Row, Col int
}
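The day 11 solution above expands empty rows and columns with prefix counts: `emptyRowsAbove[r]` is the number of fully empty rows at or above row `r`, so the number of empty rows strictly between two galaxies is a difference of two prefix values, and each of those rows contributes `multiplier` extra steps. A minimal standalone sketch of that distance calculation (hypothetical names, not part of the repository):

```go
package main

import "fmt"

// expandedDistance sketches the prefix-count idea: emptyRowsAbove[r] counts fully empty
// rows at or above r, emptyColsLeft[c] counts fully empty columns at or left of c.
// Every empty row/column between the two galaxies counts `factor` times instead of once.
func expandedDistance(a, b [2]int, emptyRowsAbove, emptyColsLeft []int, factor int64) int64 {
	rowLo, rowHi := a[0], b[0]
	if rowLo > rowHi {
		rowLo, rowHi = rowHi, rowLo
	}
	colLo, colHi := a[1], b[1]
	if colLo > colHi {
		colLo, colHi = colHi, colLo
	}
	dRow := int64(rowHi-rowLo) + int64(emptyRowsAbove[rowHi]-emptyRowsAbove[rowLo])*(factor-1)
	dCol := int64(colHi-colLo) + int64(emptyColsLeft[colHi]-emptyColsLeft[colLo])*(factor-1)
	return dRow + dCol
}

func main() {
	// Toy 3x3 grid with galaxies at (0,0) and (2,2); row 1 and column 1 are empty.
	emptyRowsAbove := []int{0, 1, 1}
	emptyColsLeft := []int{0, 1, 1}
	fmt.Println(expandedDistance([2]int{0, 0}, [2]int{2, 2}, emptyRowsAbove, emptyColsLeft, 2)) // 6
}
```

With `factor` set to 1000000 this matches the `multiplier = 1000000 - 1` used in `Run()` for part 2.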
day11/example (new file, 10 lines)
@@ -0,0 +1,10 @@
...#......
.......#..
#.........
..........
......#...
.#........
.........#
..........
.......#..
#...#.....
184
day12/dayTwelve.go
Normal file
184
day12/dayTwelve.go
Normal file
@@ -0,0 +1,184 @@
|
||||
package day12
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"strconv"
|
||||
"strings"
|
||||
"sync"
|
||||
"time"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Println("hello day 12.")
|
||||
|
||||
start := time.Now()
|
||||
|
||||
filename := "day12/input"
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprintf("error reading file %s\n", filename))
|
||||
}
|
||||
|
||||
result := 0
|
||||
text := string(bytes)
|
||||
text = strings.TrimSpace(text)
|
||||
|
||||
// testMask, testBlocks := ReadLine(".??..??...?##. 1,1,3")
|
||||
// blocksSum := 0
|
||||
// for _, block := range testBlocks {
|
||||
// blocksSum += block
|
||||
// }
|
||||
// testVariants := generatePermutations("", len(testMask), testBlocks, blocksSum, testMask)
|
||||
// fmt.Printf("for mask %s and blocks %+v\n", testMask, testBlocks)
|
||||
// for _, variant := range testVariants {
|
||||
// fmt.Println(variant)
|
||||
// }
|
||||
|
||||
var wg sync.WaitGroup
|
||||
lines := strings.Split(text, "\n")
|
||||
wg.Add(len(lines))
|
||||
|
||||
matches := make(chan int)
|
||||
go func() {
|
||||
wg.Wait()
|
||||
close(matches)
|
||||
}()
|
||||
|
||||
for i, line := range lines {
|
||||
go func(line string, lineNum int){
|
||||
mask, blockLengs := ReadLine(line)
|
||||
|
||||
blockLengthSum := 0
|
||||
for _, blockLen := range blockLengs {
|
||||
blockLengthSum += blockLen
|
||||
}
|
||||
|
||||
memo := make(map[string]int)
|
||||
variantsCount := generatePermutations("", len(mask), blockLengs, blockLengthSum, mask, memo)
|
||||
|
||||
log.Printf("%d : for line %s blocks %+v matches %d\n", lineNum, mask, blockLengs, variantsCount)
|
||||
matches <- variantsCount
|
||||
wg.Done()
|
||||
}(line, i)
|
||||
}
|
||||
|
||||
num := 0
|
||||
for match := range matches {
|
||||
num += 1
|
||||
result += match
|
||||
log.Printf("%d. intermediate: %d\n", num, result)
|
||||
fmt.Printf("%d\n", result)
|
||||
}
|
||||
|
||||
end := time.Now()
|
||||
|
||||
diff := end.Sub(start)
|
||||
log.Printf("> calculated for %s", diff.String())
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func myRepeat(line, sep string, amount int) string {
|
||||
acc := ""
|
||||
for i := 0; i < amount; i++ {
|
||||
acc += sep
|
||||
acc += line
|
||||
}
|
||||
acc, _ = strings.CutPrefix(acc, sep)
|
||||
|
||||
return acc
|
||||
}
|
||||
|
||||
// ???.### 1,1,3
|
||||
func ReadLine(line string) (string, []int) {
|
||||
firstSplit := strings.Split(line, " ")
|
||||
if len(firstSplit) != 2 {
|
||||
panic(fmt.Sprintf("error splitting %s into 2", line))
|
||||
}
|
||||
mask := firstSplit[0]
|
||||
mask = myRepeat(mask, "?", 5)
|
||||
blocks := firstSplit[1]
|
||||
blocks = myRepeat(blocks, ",", 5)
|
||||
// log.Printf(">> repeating blocks %s", blocks)
|
||||
blockLengthStrings := strings.Split(blocks, ",")
|
||||
blockLengs := make([]int, len(blockLengthStrings))
|
||||
|
||||
for i, blockLenStr := range blockLengthStrings {
|
||||
num, err := strconv.Atoi(blockLenStr)
|
||||
if err != nil {
|
||||
panic(fmt.Sprintf("error extracting num %s from %s\n", blockLenStr, line))
|
||||
}
|
||||
blockLengs[i] = num
|
||||
}
|
||||
|
||||
return mask, blockLengs
|
||||
}
|
||||
|
||||
func generatePermutations(curString string, targetLength int, blockLengths []int, blockLengthsSum int, mask string, memo map[string]int) int {
|
||||
|
||||
memoKey := fmt.Sprintf("%+v|%d", blockLengths, len(curString))
|
||||
memoized, memoFound := memo[memoKey]
|
||||
if memoFound {
|
||||
return memoized
|
||||
}
|
||||
|
||||
// fmt.Printf("> entering with \n%s\nfor map \n%s\n\n", curString, mask)
|
||||
// time.Sleep(time.Second)
|
||||
if !isVariantMatchesMask(curString, mask) {
|
||||
return 0
|
||||
}
|
||||
|
||||
// log.Printf("> entering with %s\n", curString)
|
||||
if len(blockLengths) == 0 {
|
||||
if len(curString) > targetLength {
|
||||
return 0
|
||||
}
|
||||
variant := curString + strings.Repeat(".", targetLength-len(curString))
|
||||
if !isVariantMatchesMask(variant, mask) {
|
||||
return 0
|
||||
}
|
||||
memo[memoKey] = 1
|
||||
return 1
|
||||
}
|
||||
|
||||
nextBlock := blockLengths[0]
|
||||
restBlocks := blockLengths[1:]
|
||||
|
||||
if len(curString) + blockLengthsSum + len(blockLengths) - 1 > targetLength {
|
||||
return 0
|
||||
}
|
||||
|
||||
isLast := len(restBlocks) == 0
|
||||
rightPointRepeat := 1
|
||||
if isLast {
|
||||
rightPointRepeat = 0
|
||||
}
|
||||
|
||||
whenPass := curString + "."
|
||||
whenAdd := curString + strings.Repeat("#", nextBlock) + strings.Repeat(".", rightPointRepeat)
|
||||
|
||||
passCount := generatePermutations(whenPass, targetLength, blockLengths, blockLengthsSum, mask, memo)
|
||||
addCount := generatePermutations(whenAdd, targetLength, restBlocks, blockLengthsSum-nextBlock, mask, memo)
|
||||
|
||||
memo[memoKey] = passCount + addCount
|
||||
return passCount + addCount
|
||||
}
|
||||
|
||||
func isVariantMatchesMask(variant, mask string) bool {
|
||||
if len(mask) < len(variant) {
|
||||
log.Printf("mask %s is less than variant %s\n", mask, variant)
|
||||
}
|
||||
maskRunes := []rune(mask)
|
||||
for i, symb := range variant {
|
||||
if maskRunes[i] == '?' {
|
||||
continue
|
||||
}
|
||||
if maskRunes[i] != symb {
|
||||
return false
|
||||
}
|
||||
}
|
||||
|
||||
return true
|
||||
}
|
||||
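dayTwelve.go above counts spring arrangements by recursively extending a candidate string and memoising on the pair (remaining blocks, length of the prefix built so far); that memo is what makes the five-times-unfolded part-2 lines tractable. The same idea written as an index-based recursion over the mask, as a standalone sketch (hypothetical names, not the repository's API):

```go
package main

import "fmt"

// countArrangements counts the ways to place the given blocks of '#' in the mask,
// where '?' may be either '#' or '.'. Memo key: (position in mask, blocks already placed).
func countArrangements(mask string, blocks []int) int {
	memo := map[[2]int]int{}
	var rec func(pos, blockIdx int) int
	rec = func(pos, blockIdx int) int {
		key := [2]int{pos, blockIdx}
		if v, ok := memo[key]; ok {
			return v
		}
		if blockIdx == len(blocks) {
			// No blocks left: the rest of the mask must allow all '.'.
			for i := pos; i < len(mask); i++ {
				if mask[i] == '#' {
					return 0
				}
			}
			return 1
		}
		if pos >= len(mask) {
			return 0
		}
		res := 0
		// Option 1: leave mask[pos] as '.' (allowed unless it is forced to be '#').
		if mask[pos] != '#' {
			res += rec(pos+1, blockIdx)
		}
		// Option 2: place the next block of '#' starting at pos, followed by a separator.
		n := blocks[blockIdx]
		if pos+n <= len(mask) {
			ok := true
			for i := pos; i < pos+n; i++ {
				if mask[i] == '.' {
					ok = false
					break
				}
			}
			if ok && (pos+n == len(mask) || mask[pos+n] != '#') {
				res += rec(pos+n+1, blockIdx+1)
			}
		}
		memo[key] = res
		return res
	}
	return rec(0, 0)
}

func main() {
	// Last line of day12/example1; the puzzle statement gives 10 arrangements.
	fmt.Println(countArrangements("?###????????", []int{3, 2, 1})) // 10
}
```

The key observation is the same as in the repository's memo: once a prefix is fixed, the number of completions depends only on where you are in the mask and which blocks remain.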
day12/example (new file, 6 lines)
@@ -0,0 +1,6 @@
#.#.### 1,1,3
.#...#....###. 1,1,3
.#.###.#.###### 1,3,1,6
####.#...#... 4,1,1
#....######..#####. 1,6,5
.###.##....# 3,2,1
day12/example1 (new file, 6 lines)
@@ -0,0 +1,6 @@
???.### 1,1,3
.??..??...?##. 1,1,3
?#?#?#?#?#?#?#? 1,3,1,6
????.#...#... 4,1,1
????.######..#####. 1,6,5
?###???????? 3,2,1
104
day12/notes.org
Normal file
104
day12/notes.org
Normal file
@@ -0,0 +1,104 @@
|
||||
#+title: Notes
|
||||
* i guess let's generate all possible? and short circuit when they are not matching mask
|
||||
how do i generate all possible?
|
||||
|
||||
i take length of the mask, that's max size
|
||||
then for each step, either put . or put n # from the input.
|
||||
|
||||
add to current string, and do 2 recursive calls, one with diminished 'queue', one with same
|
||||
* wrong answer on input
|
||||
it's too high
|
||||
|
||||
and log shows:
|
||||
|
||||
2023/12/12 15:07:52 for line ???#?.?#?#.?#???#..? blocks [4 4 5 1] matches 2
|
||||
|
||||
and it should be 0
|
||||
** huh, nope this looks good:
|
||||
testMask := "???#?.?#?#.?#???#..?"
|
||||
testBlocks := []int{4,4,5,1}
|
||||
testVariants := generatePermutations("", len(testMask), testBlocks, 14, testMask)
|
||||
fmt.Printf("for mask %s and blocks %+v\n", testMask, testBlocks)
|
||||
fmt.Println(testVariants)
|
||||
|
||||
for mask ???#?.?#?#.?#???#..? and blocks [4 4 5 1]
|
||||
[####..####..#####..# .####.####..#####..#]
|
||||
** let's check this : for line ??????#???????? blocks [7 2] matches 21
|
||||
** or this for line ?????.??#????? blocks [3 3 2 1] matches 3
|
||||
looks ok
|
||||
** this for line ??..??#?????#?##? blocks [1 1 1 1 4] matches 15
|
||||
looks good
|
||||
** for line ?#??#??#???.??.??.? blocks [1 2 3 1 1 1] matches 20
|
||||
seems ok
|
||||
** for line ???????#??.????####? blocks [1 1 1 1 1 6] matches 58
|
||||
bingo?
|
||||
|
||||
for mask ???????#??.????####? and blocks [1 1 1 1 1 6]
|
||||
#.#.#..#.#.######...
|
||||
#.#.#..#.#..######..
|
||||
#.#.#..#.#...######.
|
||||
#.#.#..#.#....######
|
||||
#.#.#..#...#.######.
|
||||
#.#.#..#...#..######
|
||||
#.#.#..#....#.######
|
||||
#.#..#.#.#.######...
|
||||
#.#..#.#.#..######..
|
||||
#.#..#.#.#...######.
|
||||
#.#..#.#.#....######
|
||||
#.#..#.#...#.######.
|
||||
#.#..#.#...#..######
|
||||
#.#..#.#....#.######
|
||||
#.#....#.#.#.######.
|
||||
#.#....#.#.#..######
|
||||
#.#....#.#..#.######
|
||||
#..#.#.#.#.######...
|
||||
#..#.#.#.#..######..
|
||||
#..#.#.#.#...######.
|
||||
#..#.#.#.#....######
|
||||
#..#.#.#...#.######.
|
||||
#..#.#.#...#..######
|
||||
#..#.#.#....#.######
|
||||
#..#...#.#.#.######.
|
||||
#..#...#.#.#..######
|
||||
#..#...#.#..#.######
|
||||
#...#..#.#.#.######.
|
||||
#...#..#.#.#..######
|
||||
#...#..#.#..#.######
|
||||
#....#.#.#.#.######.
|
||||
#....#.#.#.#..######
|
||||
#....#.#.#..#.######
|
||||
.#.#.#.#.#.######...
|
||||
.#.#.#.#.#..######..
|
||||
.#.#.#.#.#...######.
|
||||
.#.#.#.#.#....######
|
||||
.#.#.#.#...#.######.
|
||||
.#.#.#.#...#..######
|
||||
.#.#.#.#....#.######
|
||||
.#.#...#.#.#.######.
|
||||
.#.#...#.#.#..######
|
||||
.#.#...#.#..#.######
|
||||
.#..#..#.#.#.######.
|
||||
.#..#..#.#.#..######
|
||||
.#..#..#.#..#.######
|
||||
.#...#.#.#.#.######.
|
||||
.#...#.#.#.#..######
|
||||
.#...#.#.#..#.######
|
||||
..#.#..#.#.#.######.
|
||||
..#.#..#.#.#..######
|
||||
..#.#..#.#..#.######
|
||||
..#..#.#.#.#.######.
|
||||
..#..#.#.#.#..######
|
||||
..#..#.#.#..#.######
|
||||
...#.#.#.#.#.######.
|
||||
...#.#.#.#.#..######
|
||||
...#.#.#.#..#.######
|
||||
* well, maybe overnight will calculate.
|
||||
but i guess i needed to check whether blocks are 'always' taking full width
|
||||
then i'll only need to calculate once, and then multiply
|
||||
** for example
|
||||
2023/12/12 20:40:41 699 : for line ??#?????#???.? ???#?????#???.????#?????#???.????#?????#???.????#?????#???.? blocks [3 1 2 1 3 1 2 1 3 1 2 1 3 1 2 1 3 1 2 1] matches 38294856
|
||||
|
||||
??#?? ???#?? ?.?
|
||||
3,1,2,1 - 10+3 = 13
|
||||
|
||||
lowest s ###.#.##.#..... - plenty of space for additional
|
||||
226
day13/dayThirteen.go
Normal file
226
day13/dayThirteen.go
Normal file
@@ -0,0 +1,226 @@
|
||||
package day13
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
filename := "day13/input"
|
||||
fmt.Println("hello day 13.", filename)
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprintf("error reading file %s", filename))
|
||||
}
|
||||
allText := string(bytes)
|
||||
fieldTexts := strings.Split(allText, "\n\n")
|
||||
|
||||
result := 0
|
||||
for _, fieldText := range fieldTexts {
|
||||
field := ReadField(fieldText)
|
||||
result += Calc(field)
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func Calc(field Field) int {
|
||||
verticals, horizontals := field.initMirrors()
|
||||
fmt.Println(field.String())
|
||||
fmt.Printf("field width %d and height %d\n", len(field.Symbols[0]), len(field.Symbols))
|
||||
|
||||
for rowNum, row := range field.Symbols {
|
||||
for colNum, symb := range row {
|
||||
for _, horizontalMirrorUnderCheck := range horizontals {
|
||||
// if horizontalMirrorUnderCheck.Smaller != 4 {
|
||||
// continue
|
||||
// }
|
||||
mirroredRow, shouldCheck := horizontalMirrorUnderCheck.reflectCoord(rowNum)
|
||||
// log.Println("for mirror", horizontalMirrorUnderCheck.String())
|
||||
// log.Printf("> checking row %d and mirrored %d; should %t\n", rowNum, mirroredRow, shouldCheck)
|
||||
if shouldCheck {
|
||||
// log.Printf("checking horizontal mirror %+v", horizontalMirrorUnderCheck)
|
||||
// log.Printf("in should check for row %d, col %d, mirrored row %d\n", rowNum, colNum, mirroredRow)
|
||||
mirroredSymb := field.Symbols[mirroredRow][colNum]
|
||||
isMirrored := symb == mirroredSymb
|
||||
if !isMirrored {
|
||||
// log.Printf("found not mirrored : %s != %s\n", string(symb), string(mirroredSymb))
|
||||
horizontalMirrorUnderCheck.FailedLineChecks[rowNum] += 1
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
// whole row got checked.
|
||||
// let's mark successful line check for all that didn't fail this line check
|
||||
for _, horizontalMirror := range horizontals {
|
||||
_, failedCheckReported := horizontalMirror.FailedLineChecks[rowNum]
|
||||
if !failedCheckReported {
|
||||
horizontalMirror.SuccessfulLineChecks[rowNum] = struct{}{}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
rowsAboveHorizontals := 0
|
||||
for _, mirr := range horizontals {
|
||||
fmt.Println("horizontal: ", mirr.String())
|
||||
if mirr.isFullMirror() {
|
||||
log.Printf(">> found perfect Horizontal %+v\n", mirr)
|
||||
rowsAboveHorizontals += (mirr.Smaller + 1)
|
||||
}
|
||||
}
|
||||
|
||||
for colNum, _ := range field.Symbols[0] {
|
||||
for rowNum, row := range field.Symbols {
|
||||
symb := row[colNum]
|
||||
for _, verticalMirrorUnderCheck := range verticals {
|
||||
// if verticalMirrorUnderCheck.Smaller != 8 {
|
||||
// continue
|
||||
// }
|
||||
mirroredCol, shouldCheck := verticalMirrorUnderCheck.reflectCoord(colNum)
|
||||
if shouldCheck {
|
||||
// log.Printf("checking vertical mirror %+v", verticalMirrorUnderCheck)
|
||||
// log.Printf("in should check for row %d, col %d, mirrored col %d\n", rowNum, colNum, mirroredCol)
|
||||
mirroredSymb := field.Symbols[rowNum][mirroredCol]
|
||||
isMirrored := symb == mirroredSymb
|
||||
if !isMirrored {
|
||||
// log.Printf("found not mirrored : %s != %s\n", string(symb), string(mirroredSymb))
|
||||
verticalMirrorUnderCheck.FailedLineChecks[colNum] += 1
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
// whole row got checked.
|
||||
// let's mark successful line check for all that didn't fail this line check
|
||||
for _, verticalMirror := range verticals {
|
||||
_, failedCheckReported := verticalMirror.FailedLineChecks[colNum]
|
||||
if !failedCheckReported {
|
||||
verticalMirror.SuccessfulLineChecks[colNum] = struct{}{}
|
||||
}
|
||||
}
|
||||
}
|
||||
colsToLeftOfHorizontals := 0
|
||||
for _, mirr := range verticals {
|
||||
fmt.Println("vertical: ", mirr.String())
|
||||
if mirr.isFullMirror() {
|
||||
log.Printf(">> found perfect Vertical %+v\n", mirr)
|
||||
colsToLeftOfHorizontals += (mirr.Smaller + 1)
|
||||
}
|
||||
}
|
||||
|
||||
result := colsToLeftOfHorizontals + 100*rowsAboveHorizontals
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
type Field struct {
|
||||
Symbols [][]rune
|
||||
}
|
||||
|
||||
func ReadField(fieldText string) Field {
|
||||
fieldText = strings.TrimSpace(fieldText)
|
||||
lines := strings.Split(fieldText, "\n")
|
||||
|
||||
symbols := make([][]rune, len(lines))
|
||||
for i, line := range lines {
|
||||
symbols[i] = []rune(line)
|
||||
}
|
||||
|
||||
return Field{
|
||||
Symbols: symbols,
|
||||
}
|
||||
}
|
||||
|
||||
func (f *Field) String() string {
|
||||
text := "\n"
|
||||
for _, row := range f.Symbols {
|
||||
text += string(row)
|
||||
text += "\n"
|
||||
}
|
||||
return text
|
||||
}
|
||||
|
||||
func (f *Field) initMirrors() (vertical []Mirror, horizontal []Mirror) {
|
||||
height := len(f.Symbols)
|
||||
width := len(f.Symbols[0])
|
||||
amountHorizontal := height - 1
|
||||
amountVertical := width - 1
|
||||
horizontal = make([]Mirror, amountHorizontal)
|
||||
vertical = make([]Mirror, amountVertical)
|
||||
|
||||
for rowNum := 0; rowNum < amountHorizontal; rowNum++ {
|
||||
maxDist := min(rowNum, height - 1 - (rowNum+1))
|
||||
// log.Println("maxDist ", maxDist, "for rowNum ", rowNum)
|
||||
horizontal[rowNum] = Mirror{
|
||||
Smaller: rowNum,
|
||||
Bigger: rowNum + 1,
|
||||
SuccessfulLineChecks: make(map[int]any),
|
||||
FailedLineChecks: make(map[int]int),
|
||||
MaxDistToCheck: maxDist,
|
||||
}
|
||||
}
|
||||
|
||||
for colNum := 0; colNum < amountVertical; colNum++ {
|
||||
maxDist := min(colNum, width - 1 - (colNum+1))
|
||||
vertical[colNum] = Mirror{
|
||||
Smaller: colNum,
|
||||
Bigger: colNum + 1,
|
||||
SuccessfulLineChecks: make(map[int]any),
|
||||
FailedLineChecks: make(map[int]int),
|
||||
MaxDistToCheck: min(colNum, maxDist),
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
type Mirror struct {
|
||||
// located between lines
|
||||
Smaller, Bigger int
|
||||
MaxDistToCheck int // how many steps from mirrow have to be checked to confirm
|
||||
// i.e if mirror between 0 and 1 - only rows 0 & 1 have to be checked, row 2 is 'mirrored' outside of the field and 'ok'
|
||||
// value 0 means one step from 'mirror' so rows 0 and 1
|
||||
SuccessfulLineChecks map[int]any
|
||||
FailedLineChecks map[int]int // from line num, to amount of errors in that line
|
||||
}
|
||||
|
||||
func (m *Mirror)isFullMirror() bool {
|
||||
correctFailedLinesCount := len(m.FailedLineChecks) == 2
|
||||
if correctFailedLinesCount {
|
||||
for failedLine, failedSymbols := range m.FailedLineChecks {
|
||||
reflectedLine, _ := m.reflectCoord(failedLine)
|
||||
doublyReflected, _ := m.reflectCoord(reflectedLine)
|
||||
// log.Printf(">>>> checking failed line %d, reflected is %d; doubly %d. amount failed is %d\n", failedLine, reflectedLine, doublyReflected, failedSymbols)
|
||||
if failedSymbols == 1 && (doublyReflected == failedLine) {
|
||||
return true
|
||||
}
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
func (m *Mirror)String() string {
|
||||
return fmt.Sprintf("Mirror (full %t) between %d and %d. successful lines: %+v ; failed lines: %+v. Max check dist: %d\n",
|
||||
m.isFullMirror(), m.Smaller, m.Bigger, m.SuccessfulLineChecks, m.FailedLineChecks, m.MaxDistToCheck)
|
||||
}
|
||||
|
||||
func (m *Mirror) reflectCoord(coord int) (reflected int, shouldCheck bool) {
|
||||
dist := m.Smaller - coord
|
||||
|
||||
// _, distConfirmed := m.SuccessfulLineChecks[dist]
|
||||
// if distConfirmed {
|
||||
// // log.Printf("> getting dist confirmed for coord %d ; dist %d\n", coord, dist)
|
||||
// return 0, false // either line already fully confirmed, or failed. no need for additional checks
|
||||
// }
|
||||
|
||||
reflected = m.Bigger + dist
|
||||
if dist < 0 {
|
||||
dist = coord - m.Bigger
|
||||
reflected = m.Smaller - dist
|
||||
}
|
||||
|
||||
shouldCheck = dist <= m.MaxDistToCheck
|
||||
|
||||
return reflected, shouldCheck
|
||||
}
|
||||
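The Mirror type above models a reflection axis between lines `Smaller` and `Bigger = Smaller + 1`; `reflectCoord` maps a line index to its mirror image by keeping the distance to the axis, and `MaxDistToCheck` limits the check to pairs that actually fall inside the field. The reflection arithmetic in isolation (a short sketch under the same between-two-lines convention):

```go
package main

import "fmt"

// reflect mirrors a row index across the axis that sits between rows smaller and smaller+1.
func reflect(smaller, coord int) int {
	bigger := smaller + 1
	if coord <= smaller {
		return bigger + (smaller - coord)
	}
	return smaller - (coord - bigger)
}

func main() {
	// Mirror between rows 3 and 4: 3<->4, 2<->5, 1<->6, 0<->7.
	for _, r := range []int{0, 1, 2, 3, 4, 5, 6, 7} {
		fmt.Printf("row %d reflects to row %d\n", r, reflect(3, r))
	}
}
```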
15
day13/example
Normal file
15
day13/example
Normal file
@@ -0,0 +1,15 @@
|
||||
#.##..##.
|
||||
..#.##.#.
|
||||
##......#
|
||||
##......#
|
||||
..#.##.#.
|
||||
..##..##.
|
||||
#.#.##.#.
|
||||
|
||||
#...##..#
|
||||
#....#..#
|
||||
..##..###
|
||||
#####.##.
|
||||
#####.##.
|
||||
..##..###
|
||||
#....#..#
|
||||
day13/example1 (new file, 7 lines)
@@ -0,0 +1,7 @@
#.##..##.
..#.##.#.
##......#
##......#
..#.##.#.
..##..##.
#.#.##.#.
day13/example2 (new file, 7 lines)
@@ -0,0 +1,7 @@
#...##..#
#....#..#
..##..###
#####.##.
#####.##.
..##..###
#....#..#
15
day13/example3
Normal file
15
day13/example3
Normal file
@@ -0,0 +1,15 @@
|
||||
.#..#..
|
||||
.##.###
|
||||
..####.
|
||||
##.##.#
|
||||
#.####.
|
||||
#.#.##.
|
||||
##.##.#
|
||||
..####.
|
||||
.##.###
|
||||
.#..#..
|
||||
##.....
|
||||
#.###.#
|
||||
##.....
|
||||
##.....
|
||||
#.###.#
|
||||
17
day13/example4
Normal file
17
day13/example4
Normal file
@@ -0,0 +1,17 @@
|
||||
...#...####...#..
|
||||
.....##.##.##....
|
||||
##....######....#
|
||||
..#.##.#..#.##...
|
||||
##.###.####.###.#
|
||||
..###...##...###.
|
||||
#####.##..##.####
|
||||
#######....######
|
||||
###...#.##.#...##
|
||||
....###.##.###...
|
||||
##.####.##.####.#
|
||||
..###...##...###.
|
||||
##.#.##....##.#.#
|
||||
##..#.#....#.#..#
|
||||
##.###.#..#.###.#
|
||||
###.#...##...#.##
|
||||
..####.####.####.
|
||||
72
day13/notes.org
Normal file
72
day13/notes.org
Normal file
@@ -0,0 +1,72 @@
|
||||
#+title: Notes
|
||||
* part 2 problems
|
||||
making example 3 from first field of my input
|
||||
|
||||
.#..#..
|
||||
.##.###
|
||||
..####.
|
||||
##.##.#
|
||||
#.####.
|
||||
#.#.##.
|
||||
##.##.#
|
||||
..####.
|
||||
.##.###
|
||||
.#..#..
|
||||
##.....
|
||||
#.###.#
|
||||
##.....
|
||||
##.....
|
||||
#.###.#
|
||||
* the mirror should be between 4 & 5
|
||||
but my output is
|
||||
horizontal: Mirror (full false) between 4 and 5. successful lines: map[0:{} 1:{} 2:{} 3:{} 4:{} 6:{} 7:{} 8:{} 9:{} 10:{} 11:{} 12:{} 13:{} 14:{}] ; failed lines: map[5:1]. Max check dist: 4
|
||||
|
||||
why is line 4 marked as successful?
|
||||
|
||||
** let's turn off verticals, and only look at checks for horizontal 4
|
||||
** why do i have 'row 4, mirrored 0'?
|
||||
because of 'should check false' i guess
|
||||
** now example 3 works, but some other still don't find the mirror
|
||||
* another example
|
||||
error should be on line 2
|
||||
|
||||
...#...####...#..
|
||||
.....##.##.##....
|
||||
##....######....#
|
||||
..#.##.#..#.##...
|
||||
##.###.####.###.#
|
||||
..###...##...###.
|
||||
#####.##..##.####
|
||||
#######....######
|
||||
###...#.##.#...##
|
||||
....###.##.###...
|
||||
##.####.##.####.#
|
||||
..###...##...###.
|
||||
##.#.##....##.#.#
|
||||
##..#.#....#.#..#
|
||||
##.###.#..#.###.#
|
||||
###.#...##...#.##
|
||||
..####.####.####.
|
||||
** deleting around (8,9)
|
||||
.....
|
||||
.....
|
||||
##..#
|
||||
..#..
|
||||
##..#
|
||||
..##.
|
||||
#####
|
||||
#####
|
||||
#####
|
||||
.....
|
||||
##..#
|
||||
..##.
|
||||
##..#
|
||||
##..#
|
||||
##..#
|
||||
#####
|
||||
..##.
|
||||
|
||||
error should be (2, 3)
|
||||
** let's only keep vertical, with Smaller 8
|
||||
oh, there should be 'line 3, 1 error', but there also should be 'line 2, 1 error'
|
||||
why don't we have this?
|
||||
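An alternative framing for the part-2 smudge problem the notes wrestle with: instead of tracking successful and failed line checks per mirror, count the total number of mismatched cells over all reflected pairs for each candidate axis; a clean mirror (part 1) has exactly 0 mismatches and a smudged mirror (part 2) exactly 1. A standalone sketch (not the repository's implementation), run against day13/example2:

```go
package main

import "fmt"

// mismatchesForHorizontalMirror counts differing characters over all row pairs
// reflected across the axis between rows r and r+1.
func mismatchesForHorizontalMirror(field []string, r int) int {
	mismatches := 0
	for d := 0; r-d >= 0 && r+1+d < len(field); d++ {
		top, bottom := field[r-d], field[r+1+d]
		for i := 0; i < len(top); i++ {
			if top[i] != bottom[i] {
				mismatches++
			}
		}
	}
	return mismatches
}

func main() {
	field := []string{
		"#...##..#",
		"#....#..#",
		"..##..###",
		"#####.##.",
		"#####.##.",
		"..##..###",
		"#....#..#",
	}
	for r := 0; r < len(field)-1; r++ {
		fmt.Printf("axis between rows %d and %d: %d mismatches\n", r, r+1, mismatchesForHorizontalMirror(field, r))
	}
}
```

For this field the axis between rows 3 and 4 reports 0 mismatches (the part-1 mirror), and the axis between rows 0 and 1 reports exactly 1 (the part-2 smudge).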
216
day14/dayFourteen.go
Normal file
216
day14/dayFourteen.go
Normal file
@@ -0,0 +1,216 @@
|
||||
package day14
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Println("hello day 14")
|
||||
|
||||
field := ReadPlatform("day14/input")
|
||||
fmt.Println(field.String())
|
||||
|
||||
// fmt.Printf("> lines for field %+v\n", field.UpIndices())
|
||||
// field.Move(field.Height(), field.UpIndices())
|
||||
cycles := 1000000000
|
||||
states := make(map[string]int)
|
||||
// 2023/12/14 11:50:32 >>> found loop. known state after 10 equal to one after 3
|
||||
|
||||
var loopLen, initialStretch int
|
||||
|
||||
for i := 1; i <= cycles; i++ {
|
||||
field.DoSpinCycle()
|
||||
// fmt.Println(field.String())
|
||||
|
||||
stringRepr := field.String()
|
||||
prevIter, known := states[stringRepr]
|
||||
if known {
|
||||
log.Printf(">>> found loop. known state after %d equal to one after %d", i, prevIter)
|
||||
initialStretch = prevIter
|
||||
loopLen = i - prevIter
|
||||
break
|
||||
}
|
||||
|
||||
states[stringRepr] = i
|
||||
|
||||
if i % 100000 == 0 {
|
||||
log.Print("done ", i, " cycles")
|
||||
}
|
||||
}
|
||||
|
||||
// field is already in a 'loop' state.
|
||||
// so we've already done 'initial stretch' so to make field in same state as after 'cycles'
|
||||
// i only need to check rest of (cycles - initialStretch)
|
||||
movesToMake := (cycles - initialStretch)%loopLen
|
||||
log.Printf(">>> data: initial steps %d, loop len %d. to do same as %d iterations i need %d", initialStretch, loopLen, cycles, movesToMake)
|
||||
|
||||
for i := 1; i <= movesToMake; i++ {
|
||||
field.DoSpinCycle()
|
||||
// fmt.Println(field.String())
|
||||
}
|
||||
|
||||
// north rock load
|
||||
return field.NorthLoad()
|
||||
}
|
||||
|
||||
const Rock rune = 'O'
|
||||
const Wall rune = '#'
|
||||
const Space rune = '.'
|
||||
|
||||
type Platform struct {
|
||||
Rocks [][]rune
|
||||
}
|
||||
|
||||
func (p *Platform) Height() int {
|
||||
return len(p.Rocks)
|
||||
}
|
||||
func (p *Platform) Width() int {
|
||||
return len(p.Rocks[0])
|
||||
}
|
||||
|
||||
func ReadPlatform(filename string) Platform {
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("cannot read file: ", filename))
|
||||
}
|
||||
text := string(bytes)
|
||||
text = strings.TrimSpace(text)
|
||||
lines := strings.Split(text, "\n")
|
||||
|
||||
rocks := make([][]rune, len(lines))
|
||||
for i, line := range lines {
|
||||
rocks[i] = []rune(line)
|
||||
}
|
||||
|
||||
return Platform{
|
||||
Rocks: rocks,
|
||||
}
|
||||
}
|
||||
|
||||
func (p *Platform) String() string {
|
||||
text := "\n"
|
||||
for _, row := range p.Rocks {
|
||||
text += string(row)
|
||||
text += "\n"
|
||||
}
|
||||
|
||||
return text
|
||||
}
|
||||
|
||||
type Coord struct{ Row, Col int }
|
||||
|
||||
// indices for moving UP, from down to up
|
||||
func (p *Platform) UpIndices() [][]Coord {
|
||||
lines := make([][]Coord, 0)
|
||||
for col := 0; col < p.Width(); col++ {
|
||||
line := make([]Coord, 0)
|
||||
for row := 0; row < p.Height(); row++ {
|
||||
line = append(line, Coord{Row: row, Col: col})
|
||||
}
|
||||
lines = append(lines, line)
|
||||
}
|
||||
return lines
|
||||
}
|
||||
|
||||
// indices for moving DOWN, from up to down
|
||||
func (p *Platform) DownIndices() [][]Coord {
|
||||
lines := make([][]Coord, 0)
|
||||
for col := 0; col < p.Width(); col++ {
|
||||
line := make([]Coord, 0)
|
||||
for row := p.Height() - 1; row >= 0; row-- {
|
||||
line = append(line, Coord{Row: row, Col: col})
|
||||
}
|
||||
lines = append(lines, line)
|
||||
}
|
||||
return lines
|
||||
}
|
||||
|
||||
// indices for moving RIGHT from right to left
|
||||
func (p *Platform) RightIndices() [][]Coord {
|
||||
lines := make([][]Coord, 0)
|
||||
for row := 0; row < p.Height(); row++ {
|
||||
line := make([]Coord, 0)
|
||||
for col := p.Width() - 1; col >= 0; col-- {
|
||||
line = append(line, Coord{Row: row, Col: col})
|
||||
}
|
||||
lines = append(lines, line)
|
||||
}
|
||||
return lines
|
||||
}
|
||||
|
||||
// indices for moving LEFT, from left to right
|
||||
func (p *Platform) LeftIndices() [][]Coord {
|
||||
lines := make([][]Coord, 0)
|
||||
for row := 0; row < p.Height(); row++ {
|
||||
line := make([]Coord, 0)
|
||||
for col := 0; col < p.Width(); col++ {
|
||||
line = append(line, Coord{Row: row, Col: col})
|
||||
}
|
||||
lines = append(lines, line)
|
||||
}
|
||||
return lines
|
||||
}
|
||||
|
||||
func (p *Platform) SymbAt(coord Coord) rune {
|
||||
return p.Rocks[coord.Row][coord.Col]
|
||||
}
|
||||
func (p *Platform) SetSymbAt(coord Coord, symb rune) {
|
||||
p.Rocks[coord.Row][coord.Col] = symb
|
||||
}
|
||||
|
||||
func (p *Platform) Move(n int, lines [][]Coord) {
|
||||
for _, line := range lines {
|
||||
moveSize := 0
|
||||
for i, coord := range line {
|
||||
symb := p.SymbAt(coord)
|
||||
switch symb {
|
||||
case Space:
|
||||
moveSize += 1
|
||||
if moveSize > n {
|
||||
moveSize = n
|
||||
}
|
||||
case Wall:
|
||||
moveSize = 0
|
||||
case Rock:
|
||||
if moveSize == 0 {
|
||||
continue
|
||||
}
|
||||
// get coord for moveSize back. and set that to 'o'
|
||||
// and set current to '.'
|
||||
// panic if that place is not '.' i guess
|
||||
moveTo := line[i-moveSize]
|
||||
symbAtTarget := p.SymbAt(moveTo)
|
||||
if symbAtTarget != Space {
|
||||
panic(fmt.Sprintf("attempting to move %+v to %+v, target symbol is %s, not '.'",
|
||||
coord, moveTo, string(symbAtTarget)))
|
||||
}
|
||||
p.SetSymbAt(moveTo, Rock)
|
||||
p.SetSymbAt(coord, Space)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
func (p *Platform) NorthLoad() int {
|
||||
total := 0
|
||||
height := p.Height()
|
||||
for i, row := range p.Rocks {
|
||||
for _, symb := range row {
|
||||
if symb == Rock {
|
||||
total += (height - i)
|
||||
}
|
||||
}
|
||||
}
|
||||
return total
|
||||
}
|
||||
|
||||
func (p *Platform) DoSpinCycle() {
|
||||
// north, west, south, east - till the end
|
||||
p.Move(p.Height(), p.UpIndices())
|
||||
p.Move(p.Width(), p.LeftIndices())
|
||||
p.Move(p.Height(), p.DownIndices())
|
||||
p.Move(p.Width(), p.RightIndices())
|
||||
}
|
||||
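Run() above detects a repeated platform state and then fast-forwards: once the state after spin `i` equals the state after spin `prev`, the sequence is periodic with period `i - prev`, so the state after `N` spins equals the state after `prev + (N - prev) % (i - prev)` spins. That arithmetic in isolation (a sketch with hypothetical names):

```go
package main

import "fmt"

// equivalentIteration returns the small iteration index whose state equals the state
// after n iterations, given that the state after `prev` repeats with period `loopLen`.
func equivalentIteration(n, prev, loopLen int) int {
	if n <= prev {
		return n
	}
	return prev + (n-prev)%loopLen
}

func main() {
	// e.g. a loop detected where the state after 10 spins equals the state after 3 (period 7):
	fmt.Println(equivalentIteration(1000000000, 3, 7)) // 6
}
```

The loop in Run() applies the same remainder, `(cycles - initialStretch) % loopLen`, as extra spins on a field that is already inside the loop.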
day14/example (new file, 10 lines)
@@ -0,0 +1,10 @@
O....#....
O.OO#....#
.....##...
OO.#O....O
.O.....O#.
O.#..O.#.#
..O..#O..O
.......O..
#....###..
#OO..#....
148
day15/dayFifteen.go
Normal file
148
day15/dayFifteen.go
Normal file
@@ -0,0 +1,148 @@
|
||||
package day15
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"regexp"
|
||||
"slices"
|
||||
"strconv"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Println("hello day 15")
|
||||
log.Println("hello day 15")
|
||||
filename := "day15/input"
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("error reading file ", filename))
|
||||
}
|
||||
text := string(bytes)
|
||||
text = strings.TrimSpace(text)
|
||||
instructions := strings.Split(text, ",")
|
||||
|
||||
result := 0
|
||||
|
||||
boxes := make([]Box, 256)
|
||||
for i, box := range boxes {
|
||||
box.Focals = make(map[string]int)
|
||||
boxes[i] = box
|
||||
}
|
||||
|
||||
for _, instructionStr := range instructions {
|
||||
i := ReadInstruction(instructionStr)
|
||||
box := boxes[i.Box]
|
||||
box.Act(i)
|
||||
boxes[i.Box] = box
|
||||
|
||||
// result += ASCIIStringHash(instruction)
|
||||
}
|
||||
|
||||
for i, box := range boxes {
|
||||
if len(box.Labels) != 0 {
|
||||
log.Printf("%d box %+v final state\n", i, box)
|
||||
}
|
||||
result += (i + 1) * box.FocusingPower()
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
type Box struct {
|
||||
Labels []string
|
||||
Focals map[string]int
|
||||
}
|
||||
|
||||
func (b *Box)Act(i Instruction) {
|
||||
log.Printf("for box %+v instruction \n%s\n", b, i.String())
|
||||
switch i.Action {
|
||||
case Put:
|
||||
_, found := b.Focals[i.Label]
|
||||
if !found {
|
||||
b.Labels = append(b.Labels, i.Label)
|
||||
}
|
||||
b.Focals[i.Label] = i.LensFocal
|
||||
|
||||
case Remove:
|
||||
_, found := b.Focals[i.Label]
|
||||
if !found {
|
||||
return
|
||||
}
|
||||
index := slices.Index(b.Labels, i.Label)
|
||||
delete(b.Focals, i.Label)
|
||||
b.Labels = slices.Delete(b.Labels, index, index+1)
|
||||
}
|
||||
log.Printf("result : %+v\n", b)
|
||||
return
|
||||
}
|
||||
|
||||
func (b *Box)FocusingPower() int {
|
||||
result := 0
|
||||
for i, label := range b.Labels {
|
||||
result += (i + 1) * b.Focals[label]
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
type Action rune
|
||||
|
||||
const (
|
||||
Put Action = '='
|
||||
Remove = '-'
|
||||
)
|
||||
|
||||
type Instruction struct {
|
||||
Label string
|
||||
Box int
|
||||
Action Action
|
||||
LensFocal int
|
||||
}
|
||||
|
||||
func (i *Instruction) String() string {
|
||||
operation := ""
|
||||
switch i.Action {
|
||||
case Put:
|
||||
operation = "put into"
|
||||
case Remove:
|
||||
operation = "remove from"
|
||||
}
|
||||
return fmt.Sprintf("%s\t\t%d of focal %d %s", operation, i.Box, i.LensFocal, i.Label)
|
||||
}
|
||||
|
||||
func ReadInstruction(str string) Instruction {
|
||||
result := Instruction{}
|
||||
|
||||
re := regexp.MustCompile(`(?P<label>\D+)(?P<operation>[=\-])(?P<focal>\d*)`)
|
||||
// log.Println("in str ", str)
|
||||
fields := re.FindStringSubmatch(str)
|
||||
// log.Printf("in %s found %+v", str, fields)
|
||||
|
||||
operation := fields[2]
|
||||
operationRune := []rune(operation)[0]
|
||||
result.Action = Action(operationRune)
|
||||
|
||||
if operationRune == '=' {
|
||||
focalStr := fields[3]
|
||||
focal, err := strconv.Atoi(focalStr)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("error reading focal from ", str))
|
||||
}
|
||||
result.LensFocal = focal
|
||||
}
|
||||
|
||||
result.Label = fields[1]
|
||||
result.Box = ASCIIStringHash(result.Label)
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func ASCIIStringHash(str string) int {
|
||||
result := 0
|
||||
for _, symb := range str {
|
||||
result += int(symb)
|
||||
result *= 17
|
||||
result %= 256
|
||||
}
|
||||
return result
|
||||
}
|
||||
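// A quick check of the hash above against the values the puzzle states for its
// example: "HASH" hashes to 52, and the labels from day15/example land in the
// boxes described there ("rn" and "cm" in box 0, "qp" in box 1).
// hashSanityCheck is a hypothetical helper, not called anywhere in this package.
func hashSanityCheck() bool {
    return ASCIIStringHash("HASH") == 52 &&
        ASCIIStringHash("rn") == 0 &&
        ASCIIStringHash("cm") == 0 &&
        ASCIIStringHash("qp") == 1
}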
1
day15/example
Normal file
1
day15/example
Normal file
@@ -0,0 +1 @@
|
||||
rn=1,cm-,qp=3,cm=2,qp-,pc=4,ot=9,ab=5,pc-,pc=6,ot=7
|
||||
10
day16/example
Normal file
10
day16/example
Normal file
@@ -0,0 +1,10 @@
|
||||
.|...\....
|
||||
|.-.\.....
|
||||
.....|-...
|
||||
........|.
|
||||
..........
|
||||
.........\
|
||||
..../.\\..
|
||||
.-.-/..|..
|
||||
.|....-|.\
|
||||
..//.|....
|
||||
306
day16/floorWillBeLava.go
Normal file
306
day16/floorWillBeLava.go
Normal file
@@ -0,0 +1,306 @@
|
||||
package day16
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"math"
|
||||
"os"
|
||||
"strings"
|
||||
"sync"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Println("hello from day 16")
|
||||
log.Println("starting")
|
||||
filename := "day16/input"
|
||||
field := ReadField(filename)
|
||||
startPoints := StartPoints(&field)
|
||||
|
||||
var startPointsWaitGroup sync.WaitGroup
|
||||
startPointsWaitGroup.Add(len(startPoints))
|
||||
|
||||
results := make(chan int)
|
||||
|
||||
go func() {
|
||||
startPointsWaitGroup.Wait()
|
||||
close(results)
|
||||
}()
|
||||
|
||||
for _, start := range startPoints {
|
||||
go func(start MovementPoint) {
|
||||
cleanField := ReadField(filename)
|
||||
cleanField.StartTraversal(start)
|
||||
thisResult := cleanField.CountEnergized()
|
||||
results <- thisResult
|
||||
startPointsWaitGroup.Done()
|
||||
}(start)
|
||||
}
|
||||
|
||||
max := math.MinInt
|
||||
for energized := range results {
|
||||
if energized > max {
|
||||
max = energized
|
||||
log.Println("found new max: ", max)
|
||||
}
|
||||
}
|
||||
|
||||
// fmt.Println(field.String())
|
||||
// field.StartTraversal()
|
||||
|
||||
// fmt.Println(field.ShowEnergyzed())
|
||||
|
||||
return max
|
||||
}
|
||||
|
||||
func StartPoints(f *Field) []MovementPoint {
|
||||
result := make([]MovementPoint, 0)
|
||||
|
||||
for rowNum, row := range f.cells {
|
||||
result = append(result,
|
||||
MovementPoint{Row: rowNum, Col: 0, Direction: Rightward},
|
||||
MovementPoint{Row: rowNum, Col: len(row) - 1, Direction: Leftward})
|
||||
}
|
||||
|
||||
for colNum := range f.cells[0] {
|
||||
result = append(result,
|
||||
MovementPoint{Row: 0, Col: colNum, Direction: Downward},
|
||||
MovementPoint{Row: len(f.cells) - 1, Col: colNum, Direction: Upward})
|
||||
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
// have shared field
|
||||
// running traversal recursive function per ray
|
||||
// exit if going out of field or visiting cell that already had light in this direction
|
||||
// (i.e. encountering a loop)
|
||||
|
||||
type CellType rune
|
||||
|
||||
const (
|
||||
Empty CellType = '.'
|
||||
SplitterNS = '|'
|
||||
SplitterEW = '-'
|
||||
MirrorBackslash = '\\'
|
||||
MirrorSlash = '/'
|
||||
)
|
||||
|
||||
type Direction int
|
||||
|
||||
const (
|
||||
Upward Direction = iota
|
||||
Downward
|
||||
Leftward
|
||||
Rightward
|
||||
)
|
||||
|
||||
type Cell struct {
|
||||
CellType CellType
|
||||
KnownBeams map[Direction]any
|
||||
}
|
||||
|
||||
type Field struct {
|
||||
cells [][]*Cell
|
||||
}
|
||||
|
||||
func (f *Field) isValid(mp MovementPoint) bool {
|
||||
if mp.Row < 0 || mp.Col < 0 {
|
||||
return false
|
||||
}
|
||||
if mp.Row >= len(f.cells) || len(f.cells) == 0 || mp.Col >= len(f.cells[0]) {
|
||||
return false
|
||||
}
|
||||
return true
|
||||
}
|
||||
|
||||
func ReadField(filename string) Field {
|
||||
result := Field{}
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("cannot read file: ", filename))
|
||||
}
|
||||
text := string(bytes)
|
||||
text = strings.TrimSpace(text)
|
||||
for _, line := range strings.Split(text, "\n") {
|
||||
rowCells := make([]*Cell, 0)
|
||||
for _, symb := range line {
|
||||
rowCells = append(rowCells, &Cell{
|
||||
CellType: CellType(symb),
|
||||
KnownBeams: make(map[Direction]any),
|
||||
})
|
||||
}
|
||||
result.cells = append(result.cells, rowCells)
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
func (f *Field) String() string {
|
||||
result := "\n"
|
||||
for _, row := range f.cells {
|
||||
for _, cell := range row {
|
||||
result += string(cell.CellType)
|
||||
}
|
||||
result += "\n"
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
func (f *Field) ShowEnergyzed() string {
|
||||
result := "\n"
|
||||
for _, row := range f.cells {
|
||||
for _, cell := range row {
|
||||
if len(cell.KnownBeams) > 0 {
|
||||
result += "#"
|
||||
} else {
|
||||
result += string(cell.CellType)
|
||||
}
|
||||
|
||||
}
|
||||
result += "\n"
|
||||
}
|
||||
return result
|
||||
|
||||
}
|
||||
|
||||
type MovementPoint struct {
|
||||
Row, Col int
|
||||
Direction Direction
|
||||
}
|
||||
|
||||
func (f *Field) StartTraversal(startPoint MovementPoint) {
|
||||
reportedVisits := make(chan MovementPoint)
|
||||
var wg sync.WaitGroup
|
||||
|
||||
go f.RecordVisits(reportedVisits)
|
||||
wg.Add(1)
|
||||
go f.TraverseFrom(startPoint, reportedVisits, &wg)
|
||||
|
||||
wg.Wait()
|
||||
close(reportedVisits)
|
||||
}
|
||||
|
||||
func (f *Field) CountEnergized() (result int) {
|
||||
for _, row := range f.cells {
|
||||
for _, cell := range row {
|
||||
if len(cell.KnownBeams) > 0 {
|
||||
result += 1
|
||||
}
|
||||
}
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (f *Field) RecordVisits(reportedPoints <-chan MovementPoint) {
|
||||
for point := range reportedPoints {
|
||||
cell := f.cells[point.Row][point.Col]
|
||||
// log.Printf("recording visit %+v to %+v at row %d col %d\n", point, cell, point.Row, point.Col)
|
||||
cell.KnownBeams[point.Direction] = struct{}{}
|
||||
}
|
||||
}
|
||||
|
||||
// starting at point, mark as visited
|
||||
// move (concurrently if required) into next points
|
||||
// ends - when out of the field, or if encountering a cycle
|
||||
func (f *Field) TraverseFrom(current MovementPoint, reportVisits chan<- MovementPoint, wg *sync.WaitGroup) {
|
||||
// log.Printf("> starting traverse through %+v", current)
|
||||
if !f.isValid(current) {
|
||||
log.Println("invalid current ", current, " should be impossible")
|
||||
wg.Done()
|
||||
return
|
||||
}
|
||||
cell := f.cells[current.Row][current.Col]
|
||||
_, knownDirection := cell.KnownBeams[current.Direction]
|
||||
if knownDirection {
|
||||
// log.Printf("found cycle at %+v in %+v", current, cell)
|
||||
wg.Done()
|
||||
return
|
||||
}
|
||||
|
||||
reportVisits <- current
|
||||
|
||||
nextPoints := NextPoints(f, current)
|
||||
// log.Printf("for current %+v next are: %+v\n", current, nextPoints)
|
||||
switch len(nextPoints) {
|
||||
case 0:
|
||||
wg.Done()
|
||||
return
|
||||
case 1:
|
||||
f.TraverseFrom(nextPoints[0], reportVisits, wg)
|
||||
return
|
||||
case 2:
|
||||
wg.Add(1)
|
||||
go f.TraverseFrom(nextPoints[0], reportVisits, wg)
|
||||
f.TraverseFrom(nextPoints[1], reportVisits, wg)
|
||||
return
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func NextPoints(f *Field, current MovementPoint) []MovementPoint {
|
||||
cell := f.cells[current.Row][current.Col]
|
||||
nextDirections := cell.CellType.NextDirections(current.Direction)
|
||||
nextCells := make([]MovementPoint, 0)
|
||||
for _, direction := range nextDirections {
|
||||
nextMovementPoint := current.ApplyDirection(direction)
|
||||
if f.isValid(nextMovementPoint) {
|
||||
nextCells = append(nextCells, nextMovementPoint)
|
||||
}
|
||||
}
|
||||
return nextCells
|
||||
}
|
||||
|
||||
// value receiver, can safely modify incoming mp
|
||||
// doesn't know about Field dimensions
|
||||
func (mp MovementPoint) ApplyDirection(d Direction) MovementPoint {
|
||||
switch d {
|
||||
case Upward:
|
||||
mp.Row -= 1
|
||||
case Downward:
|
||||
mp.Row += 1
|
||||
case Leftward:
|
||||
mp.Col -= 1
|
||||
case Rightward:
|
||||
mp.Col += 1
|
||||
}
|
||||
mp.Direction = d
|
||||
return mp
|
||||
}
|
||||
|
||||
func (ct CellType) NextDirections(currentDirection Direction) (nextDirections []Direction) {
|
||||
switch ct {
|
||||
case Empty:
|
||||
nextDirections = []Direction{currentDirection}
|
||||
case SplitterNS:
|
||||
if currentDirection == Rightward || currentDirection == Leftward {
|
||||
nextDirections = []Direction{Upward, Downward}
|
||||
} else {
|
||||
nextDirections = []Direction{currentDirection}
|
||||
}
|
||||
case SplitterEW:
|
||||
if currentDirection == Downward || currentDirection == Upward {
|
||||
nextDirections = []Direction{Leftward, Rightward}
|
||||
} else {
|
||||
nextDirections = []Direction{currentDirection}
|
||||
}
|
||||
case MirrorBackslash:
|
||||
// mirror symbol is \
|
||||
directionMappings := map[Direction]Direction{
|
||||
Leftward: Upward,
|
||||
Rightward: Downward,
|
||||
Upward: Leftward,
|
||||
Downward: Rightward,
|
||||
}
|
||||
nextDirections = []Direction{directionMappings[currentDirection]}
|
||||
case MirrorSlash:
|
||||
// mirror symbol is /
|
||||
directionMappings := map[Direction]Direction{
|
||||
Leftward: Downward,
|
||||
Rightward: Upward,
|
||||
Upward: Rightward,
|
||||
Downward: Leftward,
|
||||
}
|
||||
nextDirections = []Direction{directionMappings[currentDirection]}
|
||||
}
|
||||
return
|
||||
}
|
||||
321
day17/clumsyCrucible.go
Normal file
321
day17/clumsyCrucible.go
Normal file
@@ -0,0 +1,321 @@
|
||||
package day17
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"math"
|
||||
"os"
|
||||
"slices"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Println("hello from day 17")
|
||||
filename := "day17/input"
|
||||
field := NewField(filename)
|
||||
log.Printf("%+v\n", field)
|
||||
|
||||
field.RunDijkstra()
|
||||
|
||||
lenToEnd := field.Paths[field.Finish].totalLength
|
||||
fmt.Println("check visually:")
|
||||
// fmt.Println(field.Paths[end].stringPathSoFar)
|
||||
fmt.Println(field.Paths[field.Finish].stringPathSoFar)
|
||||
return lenToEnd
|
||||
}
|
||||
|
||||
// let's do dijkstra. it also needs a priority queue
|
||||
|
||||
// priority queue would be over vertices, and would have to have enough information to
|
||||
// calc the distance from neighbors.
|
||||
// how to check condition of max 3 in one row?
|
||||
// with each vertex store [horizontal:n|vertical:n] and if it's 3 just don't consider?
|
||||
|
||||
// so in iteration, i have some vertice, with horizontal:2 for example,
|
||||
// i check all neighbors, if path through 'this' is shorter, set that as path,
|
||||
// but also mark the path with len of straight.
|
||||
//
|
||||
// so priority queue is with 'path to next'
|
||||
// or rather 'path to i,j'
|
||||
// then check for neighbors (non finished), calc distance to them through this
|
||||
// checking neighbors via 'path get directions' 'path get neighbors from directions'
|
||||
// if shorter - update
|
||||
// mark current as 'finished'
|
||||
// so, i'll be checking cost to enter directly from this table,
|
||||
// but check path len
|
||||
|
||||
func ReadEnterCosts(filename string) [][]int {
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("error reading file ", filename))
|
||||
}
|
||||
text := strings.TrimSpace(string(bytes))
|
||||
result := make([][]int, 0)
|
||||
for _, line := range strings.Split(text, "\n") {
|
||||
numbers := make([]int, 0)
|
||||
for _, digit := range line {
|
||||
num := int(digit - '0')
|
||||
numbers = append(numbers, num)
|
||||
}
|
||||
result = append(result, numbers)
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
type Coord struct {
|
||||
Row, Col int
|
||||
}
|
||||
|
||||
func (c Coord) applyDirection(d Direction) (result Coord) {
|
||||
result = c
|
||||
switch d {
|
||||
case Upward:
|
||||
result.Row -= 1
|
||||
case Downward:
|
||||
result.Row += 1
|
||||
case Leftward:
|
||||
result.Col -= 1
|
||||
case Rightward:
|
||||
result.Col += 1
|
||||
}
|
||||
return
|
||||
}
|
||||
func (c Coord) String() string {
|
||||
return fmt.Sprintf("(%d,%d)", c.Row, c.Col)
|
||||
}
|
||||
|
||||
type Direction int
|
||||
|
||||
const (
|
||||
Upward Direction = iota
|
||||
Downward
|
||||
Leftward
|
||||
Rightward
|
||||
)
|
||||
|
||||
func (d Direction) String() string {
|
||||
strings := []string{"Up", "Down", "Left", "Right"}
|
||||
return strings[d]
|
||||
}
|
||||
|
||||
func (d Direction) AsSymbol() string {
|
||||
strings := []string{"^", "v", "<", ">"}
|
||||
return strings[d]
|
||||
}
|
||||
|
||||
func (d Direction) GetPerpendicular() (directions []Direction) {
|
||||
switch d {
|
||||
case Upward:
|
||||
directions = []Direction{Leftward, Rightward}
|
||||
case Downward:
|
||||
directions = []Direction{Leftward, Rightward}
|
||||
case Leftward:
|
||||
directions = []Direction{Upward, Downward}
|
||||
case Rightward:
|
||||
directions = []Direction{Upward, Downward}
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
type PathSegmentEnd struct {
|
||||
endsAt Coord
|
||||
totalLength int
|
||||
lastSteps map[Direction]int
|
||||
lastDirection Direction
|
||||
stringPathSoFar string
|
||||
done bool
|
||||
}
|
||||
|
||||
func (p *PathSegmentEnd) NextDirections2() (next []Direction) {
|
||||
// part 2 rules: must go at least 4 tiles straight before turning, and at most 10 in a row
|
||||
lastSteps := p.lastSteps[p.lastDirection]
|
||||
|
||||
if lastSteps < 4 {
|
||||
return []Direction{p.lastDirection}
|
||||
}
|
||||
|
||||
next = append(next, p.lastDirection.GetPerpendicular()...)
|
||||
|
||||
if lastSteps < 10 {
|
||||
next = append(next, p.lastDirection)
|
||||
}
|
||||
|
||||
// log.Printf("getting directions from %+v they are %+v", p, next)
|
||||
return
|
||||
}
|
||||
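// A quick illustration of the part-2 movement rules encoded above
// (hypothetical values, not taken from the input):
//
//    p := PathSegmentEnd{lastDirection: Rightward, lastSteps: map[Direction]int{Rightward: 2}}
//    p.NextDirections2() // -> [Right]           : fewer than 4 straight steps, must continue
//    p.lastSteps[Rightward] = 5
//    p.NextDirections2() // -> [Up, Down, Right] : may turn or keep going
//    p.lastSteps[Rightward] = 10
//    p.NextDirections2() // -> [Up, Down]        : 10 in a row is the limit, must turn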
|
||||
func (p *PathSegmentEnd) NextDirections() (next []Direction) {
|
||||
next = append(next, p.lastDirection.GetPerpendicular()...)
|
||||
|
||||
// lastSteps counts consecutive tiles in the current direction; after 3 in a row (the max) we must turn
|
||||
lastSteps := p.lastSteps[p.lastDirection]
|
||||
if lastSteps < 3 {
|
||||
next = append(next, p.lastDirection)
|
||||
}
|
||||
|
||||
// log.Printf("getting directions from %+v they are %+v", p, next)
|
||||
return
|
||||
}
|
||||
|
||||
type Field struct {
|
||||
Paths map[Coord]*PathSegmentEnd
|
||||
Costs [][]int
|
||||
Height, Width int
|
||||
Start Coord
|
||||
Finish Coord
|
||||
}
|
||||
|
||||
func NewField(filename string) Field {
|
||||
enterCosts := ReadEnterCosts(filename)
|
||||
startSegment := PathSegmentEnd{
|
||||
endsAt: Coord{0, 0},
|
||||
totalLength: 0,
|
||||
lastSteps: make(map[Direction]int),
|
||||
done: true,
|
||||
lastDirection: Downward, // fake, need to init direct neighbors also
|
||||
}
|
||||
initialPaths := make(map[Coord]*PathSegmentEnd)
|
||||
initialPaths[Coord{0, 0}] = &startSegment
|
||||
height := len(enterCosts)
|
||||
width := len(enterCosts[0])
|
||||
|
||||
return Field{
|
||||
Paths: initialPaths,
|
||||
Costs: enterCosts,
|
||||
Height: height,
|
||||
Width: width,
|
||||
Start: Coord{0, 0},
|
||||
Finish: Coord{height - 1, width - 1},
|
||||
}
|
||||
}
|
||||
|
||||
func (f *Field) isValid(c Coord) bool {
|
||||
return c.Col >= 0 && c.Row >= 0 && c.Row < f.Height && c.Col < f.Width
|
||||
}
|
||||
|
||||
// presupposes that direction is valid
|
||||
func (f *Field) continuePathInDirection(curPath PathSegmentEnd, d Direction) (result PathSegmentEnd) {
|
||||
// curPath := f.Paths[from]
|
||||
from := curPath.endsAt
|
||||
nextCoord := from.applyDirection(d)
|
||||
moveCost := f.Costs[nextCoord.Row][nextCoord.Col]
|
||||
newCost := curPath.totalLength + moveCost
|
||||
lastSteps := make(map[Direction]int)
|
||||
|
||||
curPathStepsIntoThisDirection, found := curPath.lastSteps[d]
|
||||
if !found {
|
||||
lastSteps[d] = 1
|
||||
} else {
|
||||
lastSteps[d] = curPathStepsIntoThisDirection + 1
|
||||
}
|
||||
|
||||
return PathSegmentEnd{
|
||||
endsAt: nextCoord,
|
||||
totalLength: newCost,
|
||||
lastDirection: d,
|
||||
lastSteps: lastSteps,
|
||||
stringPathSoFar: curPath.stringPathSoFar + d.AsSymbol(),
|
||||
}
|
||||
}
|
||||
|
||||
func (p *PathSegmentEnd) StringKey() string {
|
||||
return fmt.Sprintf("%s from %s with len %+v", p.endsAt.String(), p.lastDirection, p.lastSteps)
|
||||
}
|
||||
|
||||
func (f *Field) RunDijkstra() {
|
||||
checking := make([]PathSegmentEnd, 0)
|
||||
distancesMap := make(map[string]int, 0)
|
||||
|
||||
startingPath := f.Paths[f.Start]
|
||||
anotherStartingPath := PathSegmentEnd{
|
||||
endsAt: Coord{0, 0},
|
||||
totalLength: 0,
|
||||
lastSteps: make(map[Direction]int),
|
||||
done: true,
|
||||
lastDirection: Rightward, // fake, need to init direct neighbors also
|
||||
stringPathSoFar: ".",
|
||||
}
|
||||
|
||||
checking = append(checking, *startingPath, anotherStartingPath)
|
||||
|
||||
distancesMap[startingPath.StringKey()] = 0
|
||||
distancesMap[anotherStartingPath.StringKey()] = 0
|
||||
|
||||
for len(checking) > 0 {
|
||||
var currentPath PathSegmentEnd
|
||||
selectingMinDistanceOfVisited := math.MaxInt
|
||||
for _, path := range checking {
|
||||
if path.totalLength < selectingMinDistanceOfVisited {
|
||||
currentPath = path
|
||||
selectingMinDistanceOfVisited = path.totalLength
|
||||
}
|
||||
}
|
||||
currentCoord := currentPath.endsAt
|
||||
directions := currentPath.NextDirections2()
|
||||
// fmt.Printf("> one more iteration for %+v ; directions will check %+v\n", currentPath, directions)
|
||||
|
||||
for _, direction := range directions {
|
||||
neighborCoord := currentCoord.applyDirection(direction)
|
||||
if !f.isValid(neighborCoord) {
|
||||
continue // prevent going off the grid
|
||||
}
|
||||
// fmt.Printf("from %+v will examine in direction %s to %+v %+v\n", currentCoord, direction, neighborCoord, currentPath)
|
||||
neighborPathSoFar, found := f.Paths[neighborCoord]
|
||||
if !found {
|
||||
neighborPathSoFar = &PathSegmentEnd{
|
||||
totalLength: math.MaxInt,
|
||||
}
|
||||
f.Paths[neighborCoord] = neighborPathSoFar
|
||||
}
|
||||
|
||||
pathIfWeGoFromCurrent := f.continuePathInDirection(currentPath, direction)
|
||||
if pathIfWeGoFromCurrent.endsAt == f.Finish {
|
||||
if pathIfWeGoFromCurrent.lastSteps[pathIfWeGoFromCurrent.lastDirection] < 4 {
|
||||
continue
|
||||
}
|
||||
}
|
||||
|
||||
distFromThatSide, isKnown := distancesMap[pathIfWeGoFromCurrent.StringKey()]
|
||||
if !isKnown {
|
||||
distancesMap[pathIfWeGoFromCurrent.StringKey()] = pathIfWeGoFromCurrent.totalLength
|
||||
// log.Printf("not known for %s \n", pathIfWeGoFromCurrent.StringKey())
|
||||
checking = append(checking, pathIfWeGoFromCurrent)
|
||||
}
|
||||
if pathIfWeGoFromCurrent.totalLength < distFromThatSide {
|
||||
f.Paths[neighborCoord] = &pathIfWeGoFromCurrent
|
||||
// log.Printf("got update for %s \n", pathIfWeGoFromCurrent.StringKey())
|
||||
distancesMap[pathIfWeGoFromCurrent.StringKey()] = pathIfWeGoFromCurrent.totalLength
|
||||
checking = append(checking, pathIfWeGoFromCurrent)
|
||||
} else {
|
||||
continue // the stored path is at least as good as this one
|
||||
}
|
||||
}
|
||||
// f.Paths[currentCoord].done = true
|
||||
checking = slices.DeleteFunc(checking, func (other PathSegmentEnd) bool { return other.stringPathSoFar == currentPath.stringPathSoFar })
|
||||
storedPath, found := f.Paths[currentCoord]
|
||||
if !found || storedPath.totalLength > currentPath.totalLength {
|
||||
f.Paths[currentCoord] = &currentPath
|
||||
}
|
||||
// time.Sleep(time.Microsecond)
|
||||
// fmt.Print(f.printLastDirection())
|
||||
// time.Sleep(time.Second)
|
||||
}
|
||||
}
|
||||
|
||||
func (f *Field) printLastDirection() (result string) {
|
||||
result += "\n"
|
||||
for rowNum := 0; rowNum < f.Height; rowNum++ {
|
||||
for colNum := 0; colNum < f.Width; colNum++ {
|
||||
path, found := f.Paths[Coord{rowNum, colNum}]
|
||||
if !found {
|
||||
result += "."
|
||||
} else {
|
||||
result += path.lastDirection.AsSymbol()
|
||||
}
|
||||
}
|
||||
result += "\n"
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
|
||||
13
day17/example
Normal file
13
day17/example
Normal file
@@ -0,0 +1,13 @@
|
||||
2413432311323
|
||||
3215453535623
|
||||
3255245654254
|
||||
3446585845452
|
||||
4546657867536
|
||||
1438598798454
|
||||
4457876987766
|
||||
3637877979653
|
||||
4654967986887
|
||||
4564679986453
|
||||
1224686865563
|
||||
2546548887735
|
||||
4322674655533
|
||||
5
day17/example2
Normal file
5
day17/example2
Normal file
@@ -0,0 +1,5 @@
|
||||
111111111111
|
||||
999999999991
|
||||
999999999991
|
||||
999999999991
|
||||
999999999991
|
||||
10
day17/notes.org
Normal file
10
day17/notes.org
Normal file
@@ -0,0 +1,10 @@
|
||||
#+title: Notes
|
||||
* so, just traversal doesn't work,
|
||||
and it's easy to imagine why.
|
||||
my guess is that i really should put 'paths to explore' into a priority queue
|
||||
|
||||
and select new ones not only by their length, but also by how far they go from the goal
|
||||
|
||||
* lots of time for no result
|
||||
* so, for 'dijkstra' don't store a set of vertices,
|
||||
but of ways we've entered them
|
||||
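* a sketch: keep the frontier in a real priority queue
not in the repo, just an illustration of the note above: container/heap over
the existing PathSegmentEnd would replace the linear "select min" scan in
RunDijkstra (pathHeap is a hypothetical name)
#+begin_src go
// import "container/heap"
type pathHeap []PathSegmentEnd

func (h pathHeap) Len() int           { return len(h) }
func (h pathHeap) Less(i, j int) bool { return h[i].totalLength < h[j].totalLength }
func (h pathHeap) Swap(i, j int)      { h[i], h[j] = h[j], h[i] }
func (h *pathHeap) Push(x any)        { *h = append(*h, x.(PathSegmentEnd)) }
func (h *pathHeap) Pop() any {
    old := *h
    item := old[len(old)-1]
    *h = old[:len(old)-1]
    return item
}

// in the search loop:
//   frontier := &pathHeap{startingPath, anotherStartingPath}
//   heap.Init(frontier)
//   current := heap.Pop(frontier).(PathSegmentEnd)
//   ... heap.Push(frontier, nextPath)
#+end_src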
14
day18/example
Normal file
14
day18/example
Normal file
@@ -0,0 +1,14 @@
|
||||
R 6 (#70c710)
|
||||
D 5 (#0dc571)
|
||||
L 2 (#5713f0)
|
||||
D 2 (#d2c081)
|
||||
R 2 (#59c680)
|
||||
D 2 (#411b91)
|
||||
L 5 (#8ceee2)
|
||||
U 2 (#caa173)
|
||||
L 1 (#1b58a2)
|
||||
U 2 (#caa171)
|
||||
R 2 (#7807d2)
|
||||
U 3 (#a77fa3)
|
||||
L 2 (#015232)
|
||||
U 2 (#7a21e3)
|
||||
8
day18/example2
Normal file
8
day18/example2
Normal file
@@ -0,0 +1,8 @@
|
||||
R 6 (#70c710)
|
||||
D 5 (#0dc571)
|
||||
L 2 (#5713f0)
|
||||
U 2 (#d2c081)
|
||||
L 2 (#59c680)
|
||||
D 2 (#411b91)
|
||||
L 2 (#8ceee2)
|
||||
U 5 (#d2c081)
|
||||
538
day18/lagoon.go
Normal file
538
day18/lagoon.go
Normal file
@@ -0,0 +1,538 @@
|
||||
package day18
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"slices"
|
||||
"strconv"
|
||||
"strings"
|
||||
"sync"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
log.Println("hello day 18")
|
||||
log.Println("problem of lagoon bgins")
|
||||
filename := "day18/input"
|
||||
instructions := ReadInstructionas2(filename)
|
||||
h, w := calcHeightWidth(instructions)
|
||||
log.Printf("read %+v instructions", instructions)
|
||||
|
||||
field := CreateField(h, w)
|
||||
|
||||
// fmt.Println(field.String())
|
||||
borderAmount := field.digByInstructions(instructions)
|
||||
// log.Println(">>> created field", field.BordersFromLeft)
|
||||
|
||||
// fmt.Println(field.String())
|
||||
// WriteToFile("borders.txt", field.String())
|
||||
// convert -size 3000x6000 xc:white -font "FreeMono" -pointsize 13 -fill black -draw @borders.txt borders.png
|
||||
|
||||
log.Printf("starting dig inside for cols %d-%d and rows %d-%d ", field.MinCol, field.MaxCol, field.MinRow, field.MaxRow)
|
||||
insideAmount := field.digInsides()
|
||||
|
||||
log.Printf("border is %d; inside is %d", borderAmount, insideAmount)
|
||||
// fmt.Println(field.String())
|
||||
// fmt.Println(field.Height, field.Width)
|
||||
// WriteToFile("fulldug.txt", field.String())
|
||||
// convert -size 3000x6000 xc:white -font "FreeMono" -pointsize 13 -fill black -draw @fulldug.txt fulldug.png
|
||||
|
||||
// field.countDugOut()
|
||||
return borderAmount + insideAmount
|
||||
}
|
||||
|
||||
// determine size of field. max(sum(up), sum(down)) for height,
|
||||
// same for left and right,
|
||||
// translate (0,0) into center of the field
|
||||
//
|
||||
// have cells, with coord. and i guess four sides, with color.
|
||||
// i guess have directions, map[direction]color
|
||||
// and have 'opposite' on direction.
|
||||
// for each direction apply it to cell coord, get cell, get opposite direction and color it
|
||||
//
|
||||
// then have method on field and cell that excavates cell and colors all neighbors
|
||||
//
|
||||
// last part is filling in insides, should be ok with horizontal scans from left by even crossings
|
||||
|
||||
type Direction int
|
||||
|
||||
const (
|
||||
Upward Direction = iota
|
||||
Downward
|
||||
Leftward
|
||||
Rightward
|
||||
)
|
||||
|
||||
func (d Direction) opposite() Direction {
|
||||
switch d {
|
||||
case Upward:
|
||||
return Downward
|
||||
case Downward:
|
||||
return Upward
|
||||
case Leftward:
|
||||
return Rightward
|
||||
case Rightward:
|
||||
return Leftward
|
||||
}
|
||||
panic("unaccounted direction")
|
||||
}
|
||||
|
||||
var DirectionNames []string = []string{"U", "D", "L", "R"}
|
||||
|
||||
func (d Direction) String() string {
|
||||
return DirectionNames[d]
|
||||
}
|
||||
func DirectionFromString(s string) Direction {
|
||||
index := slices.Index(DirectionNames, s)
|
||||
if index == -1 {
|
||||
panic(fmt.Sprint("bad direction", s))
|
||||
}
|
||||
return Direction(index)
|
||||
}
|
||||
|
||||
type Instruction struct {
|
||||
Direction Direction
|
||||
Steps int
|
||||
Color string
|
||||
}
|
||||
|
||||
func ReadInstructionas(filename string) (result []Instruction) {
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("error reading file: ", filename))
|
||||
}
|
||||
text := strings.TrimSpace(string(bytes))
|
||||
for _, line := range strings.Split(text, "\n") {
|
||||
result = append(result, ReadInstruction(line))
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func ReadInstruction(line string) Instruction {
|
||||
fields := strings.Fields(line)
|
||||
direction := DirectionFromString(fields[0])
|
||||
steps, err := strconv.Atoi(fields[1])
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("bad steps in line: ", line))
|
||||
}
|
||||
color := fields[2][1 : len(fields[2])-1]
|
||||
|
||||
return Instruction{Direction: direction, Steps: steps, Color: color}
|
||||
}
|
||||
|
||||
func ReadInstructionas2(filename string) (result []Instruction) {
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("error reading file: ", filename))
|
||||
}
|
||||
text := strings.TrimSpace(string(bytes))
|
||||
for _, line := range strings.Split(text, "\n") {
|
||||
result = append(result, ReadInstruction2(line))
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func ReadInstruction2(line string) Instruction {
|
||||
fields := strings.Fields(line)
|
||||
|
||||
hexDist := fields[2][2 : len(fields[2])-2]
|
||||
hexDirection := fields[2][len(fields[2])-2 : len(fields[2])-1]
|
||||
var direction Direction
|
||||
switch hexDirection {
|
||||
case "0":
|
||||
direction = Rightward
|
||||
case "1":
|
||||
direction = Downward
|
||||
case "2":
|
||||
direction = Leftward
|
||||
case "3":
|
||||
direction = Upward
|
||||
}
|
||||
|
||||
dist, err := strconv.ParseUint(hexDist, 16, 64)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return Instruction{
|
||||
Steps: int(dist),
|
||||
Direction: direction,
|
||||
}
|
||||
}
|
||||
|
||||
func calcHeightWidth(instructions []Instruction) (height, width int) {
|
||||
movements := make(map[Direction]int)
|
||||
for _, instr := range instructions {
|
||||
movements[instr.Direction] += instr.Steps
|
||||
}
|
||||
if movements[Downward] > movements[Upward] {
|
||||
height = 2 * movements[Downward]
|
||||
} else {
|
||||
height = 2 * movements[Upward]
|
||||
}
|
||||
|
||||
if movements[Leftward] > movements[Rightward] {
|
||||
width = 2 * movements[Leftward]
|
||||
} else {
|
||||
width = 2 * movements[Rightward]
|
||||
}
|
||||
|
||||
height += 10
|
||||
width += 10
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
type Coord struct {
|
||||
Col, Row int
|
||||
}
|
||||
|
||||
func (c Coord) applyDirection(d Direction) Coord {
|
||||
switch d {
|
||||
case Upward:
|
||||
c.Row -= 1
|
||||
case Downward:
|
||||
c.Row += 1
|
||||
case Leftward:
|
||||
c.Col -= 1
|
||||
case Rightward:
|
||||
c.Col += 1
|
||||
}
|
||||
|
||||
return c
|
||||
}
|
||||
|
||||
type Cell struct {
|
||||
IsDug bool
|
||||
ToBeDug bool
|
||||
Coord Coord
|
||||
}
|
||||
|
||||
type BorderSymbol rune
|
||||
|
||||
// border symbols are recorded as if always traversing left to right
|
||||
const (
|
||||
Vertical BorderSymbol = '|'
|
||||
ToDown BorderSymbol = '7'
|
||||
ToUp BorderSymbol = 'J'
|
||||
FromUp BorderSymbol = 'F'
|
||||
FromDown BorderSymbol = 'L'
|
||||
)
|
||||
|
||||
type Field struct {
|
||||
Height, Width int
|
||||
// Cells [][]*Cell
|
||||
Cells map[Coord]*Cell
|
||||
MinRow, MaxRow, MinCol, MaxCol int
|
||||
BordersFromLeft map[int]map[int]BorderSymbol
|
||||
}
|
||||
|
||||
func (f *Field) confirmCoord(c Coord) {
|
||||
// log.Printf("configming coord %+v", c)
|
||||
|
||||
if c.Row-3 < f.MinRow {
|
||||
f.MinRow = c.Row - 3
|
||||
}
|
||||
if c.Row+3 > f.MaxRow {
|
||||
f.MaxRow = c.Row + 3
|
||||
}
|
||||
if c.Col-3 < f.MinCol {
|
||||
f.MinCol = c.Col - 3
|
||||
}
|
||||
if c.Col+3 > f.MaxCol {
|
||||
f.MaxCol = c.Col + 3
|
||||
}
|
||||
}
|
||||
|
||||
func CreateField(height, width int) Field {
|
||||
return Field{
|
||||
Height: height, Width: width,
|
||||
Cells: make(map[Coord]*Cell),
|
||||
BordersFromLeft: make(map[int]map[int]BorderSymbol),
|
||||
}
|
||||
}
|
||||
|
||||
func PutSymbIntoMMMMap(mmmap map[int]map[int]BorderSymbol, row, col int, symb BorderSymbol) {
|
||||
rowMap := mmmap[row]
|
||||
if rowMap == nil {
|
||||
rowMap = make(map[int]BorderSymbol)
|
||||
mmmap[row] = rowMap
|
||||
}
|
||||
rowMap[col] = symb
|
||||
}
|
||||
|
||||
func (f *Field) digByInstructions(instructions []Instruction) (borderAmount int) {
|
||||
// for the last turn
|
||||
instructions = append(instructions, instructions[0])
|
||||
// but also don't overcount the border
|
||||
borderAmount -= instructions[0].Steps
|
||||
|
||||
runnerCoord := Coord{Col: 0, Row: 0}
|
||||
// f.Cells[runnerCoord] = &Cell{
|
||||
// IsDug: true,
|
||||
// }
|
||||
// f.confirmCoord(runnerCoord) // should be confirmed when the cycle is closed on last step
|
||||
// borderAmount += 1
|
||||
|
||||
var prevInstruction Instruction
|
||||
firstInstruction := true
|
||||
for _, instruction := range instructions {
|
||||
log.Printf("starting new instruction %+v", instruction)
|
||||
if !firstInstruction {
|
||||
turn := getTurnAsIfGoingFromLeft(prevInstruction.Direction, instruction.Direction)
|
||||
for _, theTurn := range turn {
|
||||
// log.Printf(">> putting turn %s", string(turn))
|
||||
PutSymbIntoMMMMap(f.BordersFromLeft, runnerCoord.Row, runnerCoord.Col, theTurn)
|
||||
}
|
||||
}
|
||||
firstInstruction = false
|
||||
// log.Printf("starting instruction %+v", instruction)
|
||||
for i := 0; i < instruction.Steps; i++ {
|
||||
runnerCoord = runnerCoord.applyDirection(instruction.Direction)
|
||||
// f.Cells[runnerCoord] = &Cell{
|
||||
// IsDug: true,
|
||||
// }
|
||||
f.confirmCoord(runnerCoord)
|
||||
borderAmount += 1
|
||||
// log.Printf("inside %+v updated border amount to %d", instruction, borderAmount)
|
||||
|
||||
if instruction.Direction == Upward || instruction.Direction == Downward {
|
||||
_, alreadyCountedTurn := f.BordersFromLeft[runnerCoord.Row][runnerCoord.Col]
|
||||
if !alreadyCountedTurn {
|
||||
PutSymbIntoMMMMap(f.BordersFromLeft, runnerCoord.Row, runnerCoord.Col, Vertical)
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
prevInstruction = instruction
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func getTurnAsIfGoingFromLeft(directionFrom, directionTo Direction) []BorderSymbol {
|
||||
// log.Printf("getTurnAsIfGoingFromLeft from %s to %s", directionFrom.String(), directionTo.String())
|
||||
|
||||
var symbol BorderSymbol
|
||||
if directionTo == Rightward && directionFrom == Upward {
|
||||
symbol = FromUp
|
||||
}
|
||||
if directionTo == Rightward && directionFrom == Downward {
|
||||
symbol = FromDown
|
||||
}
|
||||
if directionTo == Leftward && directionFrom == Upward {
|
||||
symbol = ToDown
|
||||
}
|
||||
if directionTo == Leftward && directionFrom == Downward {
|
||||
symbol = ToUp
|
||||
}
|
||||
|
||||
if directionFrom == Rightward && directionTo == Upward {
|
||||
symbol = ToUp
|
||||
}
|
||||
if directionFrom == Rightward && directionTo == Downward {
|
||||
symbol = ToDown
|
||||
}
|
||||
if directionFrom == Leftward && directionTo == Upward {
|
||||
symbol = FromDown
|
||||
}
|
||||
if directionFrom == Leftward && directionTo == Downward {
|
||||
symbol = FromUp
|
||||
}
|
||||
|
||||
// panic(fmt.Sprint("got strange from %s to %s", directionFrom.String(), directionTo.String()))
|
||||
return []BorderSymbol{symbol}
|
||||
}
|
||||
|
||||
func (f *Field) String() string {
|
||||
s := "text 15,15 \""
|
||||
|
||||
for row := f.MinRow; row <= f.MaxRow; row++ {
|
||||
rowChars := make([]rune, f.MaxCol-f.MinCol+1)
|
||||
for col := f.MinCol; col <= f.MaxCol; col++ {
|
||||
|
||||
rowBords := f.BordersFromLeft[row]
|
||||
if rowBords != nil {
|
||||
bord, exists := rowBords[col]
|
||||
if exists {
|
||||
rowChars[col-f.MinCol] = rune(bord)
|
||||
continue
|
||||
}
|
||||
}
|
||||
cell := f.Cells[Coord{col, row}]
|
||||
if cell != nil && cell.ToBeDug {
|
||||
rowChars[col-f.MinCol] = '@'
|
||||
continue
|
||||
}
|
||||
|
||||
if f.isCellDug(row, col) {
|
||||
rowChars[col-f.MinCol] = '#'
|
||||
} else {
|
||||
rowChars[col-f.MinCol] = '.'
|
||||
}
|
||||
}
|
||||
|
||||
s += string(rowChars)
|
||||
s += "\n"
|
||||
}
|
||||
s += "\""
|
||||
return s
|
||||
}
|
||||
func (f *Field) digInsides() (result int) {
|
||||
lineSum := make(chan int)
|
||||
|
||||
var wg sync.WaitGroup
|
||||
rowsCount := f.MaxRow - f.MinRow
|
||||
wg.Add(rowsCount)
|
||||
|
||||
done := make(chan bool)
|
||||
|
||||
go func() {
|
||||
wg.Wait()
|
||||
close(lineSum)
|
||||
}()
|
||||
|
||||
go func() {
|
||||
for rowInternalCount := range lineSum {
|
||||
result += rowInternalCount
|
||||
}
|
||||
close(done)
|
||||
}()
|
||||
|
||||
for row := f.MinRow; row < f.MaxRow; row++ {
|
||||
go func(row int){
|
||||
if row%10000 == 0 {
|
||||
log.Printf("processed rows %d out of %d", row, f.MaxRow)
|
||||
}
|
||||
specialBorders := f.BordersFromLeft[row]
|
||||
if len(specialBorders) == 0 {
|
||||
wg.Done()
|
||||
return
|
||||
}
|
||||
type BorderItem struct {
|
||||
border BorderSymbol
|
||||
col int
|
||||
}
|
||||
rowBorders := make([]BorderItem, 0)
|
||||
for col, borderSymbol := range specialBorders {
|
||||
rowBorders = append(rowBorders, BorderItem{borderSymbol, col})
|
||||
}
|
||||
slices.SortFunc(rowBorders, func(a BorderItem, b BorderItem) int {
|
||||
return a.col - b.col
|
||||
})
|
||||
|
||||
// log.Printf(">>>>>>> for row %d sorted %+v", row, rowBorders)
|
||||
prevBorder := rowBorders[0]
|
||||
bordersCrossed := 0
|
||||
if prevBorder.border == Vertical {
|
||||
bordersCrossed += 1
|
||||
}
|
||||
for _, specialBorder := range rowBorders[1:] {
|
||||
diff := specialBorder.col - prevBorder.col - 1
|
||||
|
||||
if specialBorder.border == ToUp && prevBorder.border == FromUp {
|
||||
bordersCrossed += 1
|
||||
prevBorder = specialBorder
|
||||
continue
|
||||
}
|
||||
if specialBorder.border == ToDown && prevBorder.border == FromDown {
|
||||
bordersCrossed += 1
|
||||
prevBorder = specialBorder
|
||||
continue
|
||||
}
|
||||
if specialBorder.border == ToUp && prevBorder.border == FromDown {
|
||||
prevBorder = specialBorder
|
||||
continue
|
||||
}
|
||||
if specialBorder.border == ToDown && prevBorder.border == FromUp {
|
||||
prevBorder = specialBorder
|
||||
continue
|
||||
}
|
||||
|
||||
if bordersCrossed%2 == 1 { // is in
|
||||
for col := prevBorder.col + 1; col < specialBorder.col; col++ {
|
||||
// f.Cells[Coord{Col: col, Row: row}] = &Cell{
|
||||
// ToBeDug: true,
|
||||
// }
|
||||
}
|
||||
lineSum <- diff
|
||||
// countInside += diff
|
||||
}
|
||||
|
||||
if specialBorder.border == Vertical {
|
||||
bordersCrossed += 1
|
||||
}
|
||||
|
||||
prevBorder = specialBorder
|
||||
}
|
||||
|
||||
wg.Done()
|
||||
}(row)
|
||||
}
|
||||
|
||||
<-done
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
// func (f *Field) digInsides() (countInside int) {
|
||||
// for row := f.MinRow; row < f.MaxRow; row++ {
|
||||
// if row % 10000 == 0 {
|
||||
// log.Printf("processed rows %d out of %d", row, f.MaxRow)
|
||||
// }
|
||||
// isInside := false
|
||||
// seenUp, seenDown := false, false // for detecting L---7 walls
|
||||
// for col := f.MinCol; col < f.MaxCol; col++ {
|
||||
// // TODO next optimization - for each row, store indices of cols with border cells
|
||||
// // so that count of inside would be done by many at a time
|
||||
// rightCellIsDug := f.isCellDug(row, col+1)
|
||||
// if f.isCellDug(row, col) {
|
||||
// upCellIsDug := f.isCellDug(row-1, col)
|
||||
// downCellIsDug := f.isCellDug(row+1, col)
|
||||
// if !rightCellIsDug {
|
||||
// if (upCellIsDug && seenDown) || (downCellIsDug && seenUp) {
|
||||
// isInside = !isInside
|
||||
// }
|
||||
// seenUp, seenDown = false, false
|
||||
// }
|
||||
// } else {
|
||||
// // not a dug out cell, maybe inside and needs to be dug out
|
||||
// if isInside {
|
||||
// // f.Cells[Coord{col, row}] = &Cell{
|
||||
// // ToBeDug: true,
|
||||
// // }
|
||||
|
||||
// countInside += 1
|
||||
// // log.Printf("tick count inside for %d %d", row, col)
|
||||
// // cellPtr.ToBeDug = true
|
||||
// }
|
||||
// if rightCellIsDug {
|
||||
// seenUp = f.isCellDug(row-1, col+1)
|
||||
// seenDown = f.isCellDug(row+1, col+1)
|
||||
// }
|
||||
|
||||
// }
|
||||
// }
|
||||
// }
|
||||
// return
|
||||
// }
|
||||
|
||||
func (f *Field) isCellDug(row, col int) bool {
|
||||
cell := f.Cells[Coord{col, row}]
|
||||
return cell != nil && cell.IsDug
|
||||
}
|
||||
|
||||
func WriteToFile(filename string, content string) {
|
||||
fileBorder, err := os.Create(filename)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
defer func() {
|
||||
if err := fileBorder.Close(); err != nil {
|
||||
panic(err)
|
||||
}
|
||||
}()
|
||||
|
||||
fileBorder.WriteString(content)
|
||||
}
|
||||
51
day18/notes.org
Normal file
51
day18/notes.org
Normal file
@@ -0,0 +1,51 @@
|
||||
#+title: Notes
|
||||
* part 2 and i'm struggling.
|
||||
maybe i need to mark 'inside' cells while i dig?
|
||||
i don't know which is 'outside' from the getgo?
|
||||
|
||||
if i mark 'all the rightside', will that help to calculate inside?
|
||||
* well, if we don't have an instruction with steps:1 i can just count points above and below the line
|
||||
without more complicated things
|
||||
|
||||
just count 'seenUp' and 'seenDown' if equal - then we changed side
|
||||
|
||||
and - we shouldn't have 'step1' because all numbers are soooo big.
|
||||
|
||||
ok. let's do that? with maps of cols.
|
||||
** CANCELLED add map[row]map[col]any
|
||||
** CANCELLED separate method to set it up after we have all of the BorderCellCols
|
||||
** CANCELLED during digInsides on each consecutive - check above and below and count
|
||||
when there's a jump - compare counts, to make decision on whether to switch 'isInside'
|
||||
** no. just because they are long doesn't mean they won't ever get one near another
|
||||
* another idea is to save | and corners, as if we're going from left to right
|
||||
this seems reasonable.
|
||||
** DONE i guess []SpecialSymbol which has Col and Symbol
|
||||
** DONE no, let's make it map. yes will have to init, but yuck anyway
|
||||
** TODO then different logic on border building.
|
||||
if U \ D - on all but last add '|'
|
||||
on last - calc with the next turn, what should be saved 'as if traversing from the left'
|
||||
|
||||
for L \ R - on last - calc what the turn was
|
||||
** TODO !! between last and first movement the corner is unknown.
|
||||
so, copy the first instruction to the end?
|
||||
** moment of hope.
|
||||
my calculation for example input for part 2
|
||||
day18 result: 952408144115
|
||||
|
||||
952408144115
|
||||
*** YES.
|
||||
*** about 1M for 4 minutes
|
||||
** so, my input is ~16M rows
|
||||
3.5 seconds per 10k
|
||||
** well, maybe i can parallel.
|
||||
*** parallel example
|
||||
day18 result: 952407566854
|
||||
*** and with separate done channel from the summing goroutine
|
||||
952408144115
|
||||
**** YES
|
||||
** and
|
||||
2023/12/18 23:35:31 border is 195341588; inside is 148441957805559
|
||||
2023/12/18 23:35:31
|
||||
|
||||
day18 result: 148442153147147
|
||||
* i should have used a formula. maybe then it would have taken less than 4 hours (sketch below)
|
||||
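* sketch of the formula approach (shoelace + Pick)
not in the repo, just what "used a formula" would look like for part 2:
the shoelace formula gives the polygon area A from the corner points, the
border length b is the sum of all steps, Pick's theorem gives interior points
i = A - b/2 + 1, and the dug-out total is i + b = A + b/2 + 1. lagoonSize is a
hypothetical helper reusing Instruction and Direction from lagoon.go.
#+begin_src go
func lagoonSize(instructions []Instruction) int {
    area2 := 0 // twice the signed shoelace area
    border := 0
    row, col := 0, 0
    for _, instr := range instructions {
        prevRow, prevCol := row, col
        switch instr.Direction {
        case Upward:
            row -= instr.Steps
        case Downward:
            row += instr.Steps
        case Leftward:
            col -= instr.Steps
        case Rightward:
            col += instr.Steps
        }
        area2 += prevCol*row - col*prevRow // cross product of consecutive corners
        border += instr.Steps
    }
    if area2 < 0 {
        area2 = -area2
    }
    return area2/2 + border/2 + 1
}
#+end_src
on the example instructions this reproduces the 952408144115 from above, without building any grid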
17
day19/example
Normal file
17
day19/example
Normal file
@@ -0,0 +1,17 @@
|
||||
px{a<2006:qkq,m>2090:A,rfg}
|
||||
pv{a>1716:R,A}
|
||||
lnx{m>1548:A,A}
|
||||
rfg{s<537:gd,x>2440:R,A}
|
||||
qs{s>3448:A,lnx}
|
||||
qkq{x<1416:A,crn}
|
||||
crn{x>2662:A,R}
|
||||
in{s<1351:px,qqz}
|
||||
qqz{s>2770:qs,m<1801:hdj,R}
|
||||
gd{a>3333:R,R}
|
||||
hdj{m>838:A,pv}
|
||||
|
||||
{x=787,m=2655,a=1222,s=2876}
|
||||
{x=1679,m=44,a=2067,s=496}
|
||||
{x=2036,m=264,a=79,s=2244}
|
||||
{x=2461,m=1339,a=466,s=291}
|
||||
{x=2127,m=1623,a=2188,s=1013}
|
||||
5
day19/example1
Normal file
5
day19/example1
Normal file
@@ -0,0 +1,5 @@
|
||||
in{x<4000:R,m<4000:R,A}
|
||||
px{a<4000:R,A}
|
||||
qqz{s>2770:R,m<1801:A,R}
|
||||
|
||||
{x=787,m=2655,a=1222,s=2876}
|
||||
57
day19/intervals.go
Normal file
57
day19/intervals.go
Normal file
@@ -0,0 +1,57 @@
|
||||
package day19
|
||||
|
||||
import (
|
||||
"sort"
|
||||
)
|
||||
|
||||
func merge(intervals [][]int) [][]int {
|
||||
const start, end = 0, 1
|
||||
|
||||
var merged [][]int
|
||||
|
||||
if len(intervals) > 1 {
|
||||
sort.Slice(intervals, func(i, j int) bool {
|
||||
return intervals[i][start] < intervals[j][start]
|
||||
})
|
||||
}
|
||||
|
||||
for _, interval := range intervals {
|
||||
last := len(merged) - 1
|
||||
if last < 0 || interval[start] > merged[last][end] {
|
||||
merged = append(merged,
|
||||
[]int{start: interval[start], end: interval[end]},
|
||||
)
|
||||
} else if interval[end] > merged[last][end] {
|
||||
merged[last][end] = interval[end]
|
||||
}
|
||||
}
|
||||
|
||||
return merged[:len(merged):len(merged)]
|
||||
}
|
||||
|
||||
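// mergeExample is a hypothetical helper showing the behaviour of merge above:
// overlapping intervals are coalesced after sorting by start.
func mergeExample() [][]int {
    return merge([][]int{{1, 3}, {2, 6}, {8, 10}}) // -> [[1 6] [8 10]]
}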
func applyLessThan(intervals [][]int, n int) [][]int {
|
||||
var lessers [][]int
|
||||
for _, interval := range intervals {
|
||||
from := interval[0]
|
||||
if from >= n {
|
||||
continue
|
||||
}
|
||||
lessers = append(lessers, []int{from, min(interval[1], n-1)}) // clamped so the result never widens the original interval (min builtin, Go 1.21+)
|
||||
}
|
||||
|
||||
return lessers
|
||||
}
|
||||
|
||||
func applyMoreThan(intervals [][]int, n int) [][]int {
|
||||
var greaters [][]int
|
||||
for _, interval := range intervals {
|
||||
to := interval[1]
|
||||
if to <= n {
|
||||
continue
|
||||
}
|
||||
greaters = append(greaters, []int{max(interval[0], n+1), to}) // clamped so the result never widens the original interval (max builtin, Go 1.21+)
|
||||
}
|
||||
// log.Printf(">>>> in applyMoreThan %d to %+v ; result %+v", n, intervals, greaters)
|
||||
|
||||
return greaters
|
||||
}
|
||||
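// splitExample is a hypothetical helper showing how a single rule such as
// "s>2770" splits the full attribute range: the passing side keeps [2771,4000]
// and the failing side (as getFailingIntervals in sortingParts.go computes it)
// keeps [1,2770].
func splitExample() (passing, failing [][]int) {
    full := [][]int{{1, 4000}}
    passing = applyMoreThan(full, 2770) // [[2771 4000]]
    failing = applyLessThan(full, 2771) // [[1 2770]]
    return
}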
47
day19/notes.org
Normal file
47
day19/notes.org
Normal file
@@ -0,0 +1,47 @@
|
||||
#+title: Notes
|
||||
* testing things
|
||||
|
||||
testSorter := day19.ReadSorterLine("qqz{s>2770:qs,m<1801:hdj,R}")
|
||||
log.Printf("my test sorter is %+v", testSorter)
|
||||
|
||||
testOperation := day19.ReadOperationLine("s>2770:qs")
|
||||
log.Println(testOperation)
|
||||
** testing simplification
|
||||
lnx{m>1548:A,A}
|
||||
qqz{s>2770:qs,m<1801:hdj,R}
|
||||
kt{m>2215:R,x>3386:A,x<3107:R,R}
|
||||
|
||||
testSorter := day19.ReadSorterLine("kt{m>2215:R,x>3386:A,x<3107:R,R}")
|
||||
log.Printf("my test sorter is %+v", testSorter)
|
||||
|
||||
simplified := day19.SimplifyOperation(testSorter)
|
||||
log.Printf("> simplivied %+v", simplified)
|
||||
* i probably don't need 'actual actors'
|
||||
just a generic function that takes 'detail' and 'sorterData'
|
||||
then applies sorterData to the detail,
|
||||
and calls itself with new sorter
|
||||
|
||||
with special cases for "R" and "A"
|
||||
|
||||
so. have function from OperationData & Detail -> true/false
|
||||
if true take the destination, if false, check next
|
||||
* well. the only way to do this is with intervals
|
||||
|
||||
so, sorter check takes in interval.
|
||||
|
||||
then for each of the rule,
|
||||
call first rule with full interval,
|
||||
deduct first rule (for those that don't match) and pass to second.
|
||||
deduct second and pass to next
|
||||
|
||||
A will return full
|
||||
R will return empty
|
||||
|
||||
and results from each rule application should be joined
|
||||
|
||||
so. i need interval deduction
|
||||
and i need interval join
|
||||
* found a bug in always using initial intervals to calculate 'failing' after each step
|
||||
2023/12/19 11:45:14 got and checked 167409079868000
|
||||
|
||||
In the above example, there are 167409079868000 distinct combinations of ratings that will be accepted.
|
||||
341
day19/sortingParts.go
Normal file
341
day19/sortingParts.go
Normal file
@@ -0,0 +1,341 @@
|
||||
package day19
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"regexp"
|
||||
"strconv"
|
||||
"strings"
|
||||
"sync"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Println("hello day 19. sorting parts")
|
||||
filename := "day19/input"
|
||||
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("cannot read file ", filename))
|
||||
}
|
||||
|
||||
text := string(bytes)
|
||||
|
||||
split := strings.Split(text, "\n\n")
|
||||
|
||||
sorters := ReadSorters(split[0])
|
||||
details := ReadDetailsPart(split[1])
|
||||
|
||||
log.Printf("yay, got sorters\n%+v\nand details\n%+v", sorters, details)
|
||||
|
||||
// countApproved := CountApprovedDetails(details, sorters)
|
||||
result := 0
|
||||
|
||||
fullIntervals := AttrIntervals{
|
||||
"x": [][]int{[]int{1, 4000}},
|
||||
"m": [][]int{[]int{1, 4000}},
|
||||
"a": [][]int{[]int{1, 4000}},
|
||||
"s": [][]int{[]int{1, 4000}},
|
||||
}
|
||||
|
||||
andChecked := processInterval(fullIntervals, "in", sorters)
|
||||
log.Print("got and checked ", andChecked)
|
||||
result = andChecked
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func CountApprovedDetails(details []DetailData, sorters map[string]SorterData) int {
|
||||
var wg sync.WaitGroup
|
||||
wg.Add(len(details))
|
||||
|
||||
approvedDetails := make(chan DetailData)
|
||||
|
||||
go func() {
|
||||
wg.Wait()
|
||||
close(approvedDetails)
|
||||
}()
|
||||
|
||||
count := 0
|
||||
acceptedScore := 0
|
||||
|
||||
done := make(chan any)
|
||||
|
||||
go func() {
|
||||
for detail := range approvedDetails {
|
||||
log.Println("got approved ", detail)
|
||||
count += 1
|
||||
for _, attrValue := range detail.Attrs {
|
||||
acceptedScore += attrValue
|
||||
}
|
||||
}
|
||||
close(done)
|
||||
}()
|
||||
|
||||
for _, d := range details {
|
||||
go func(d DetailData) {
|
||||
log.Print("> starting for ", d)
|
||||
isAccepted := ProcessDetail(d, sorters)
|
||||
if isAccepted {
|
||||
log.Println("> accepting ", d)
|
||||
approvedDetails <- d
|
||||
} else {
|
||||
log.Println("> rejecting ", d)
|
||||
}
|
||||
wg.Done()
|
||||
}(d)
|
||||
}
|
||||
|
||||
<-done
|
||||
|
||||
return acceptedScore
|
||||
}
|
||||
|
||||
type Operation rune
|
||||
|
||||
const (
|
||||
LessThan Operation = '<'
|
||||
MoreThan Operation = '>'
|
||||
)
|
||||
func (o Operation) String() string {
|
||||
return string(o)
|
||||
}
|
||||
|
||||
type OperationData struct {
|
||||
AttrName string
|
||||
Operation Operation
|
||||
Num int
|
||||
SentToName string
|
||||
InitialString string
|
||||
}
|
||||
func (od OperationData) String() string {
|
||||
return od.InitialString
|
||||
}
|
||||
|
||||
type SorterData struct {
|
||||
Name string
|
||||
DefaultState string
|
||||
Operations []OperationData
|
||||
}
|
||||
|
||||
func ReadSorters(sortersText string) map[string]SorterData {
|
||||
result := make(map[string]SorterData)
|
||||
sortersText = strings.TrimSpace(sortersText)
|
||||
lines := strings.Split(sortersText, "\n")
|
||||
|
||||
for _, line := range lines {
|
||||
sorter := SimplifyOperation( ReadSorterLine(line) )
|
||||
result[sorter.Name] = sorter
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
// qqz{s>2770:qs,m<1801:hdj,R}
|
||||
func ReadSorterLine(line string) (result SorterData) {
|
||||
re1 := regexp.MustCompile(`(?P<NAME>\D+){(?P<OPERATIONS>.+)}`)
|
||||
|
||||
firstSplit := re1.FindStringSubmatch(line)
|
||||
|
||||
result.Name = firstSplit[1]
|
||||
|
||||
operationLines := strings.Split(firstSplit[2], ",")
|
||||
operations := make([]OperationData, len(operationLines)-1)
|
||||
result.Operations = operations
|
||||
|
||||
result.DefaultState = operationLines[len(operationLines)-1]
|
||||
|
||||
for i, line := range operationLines[:len(operationLines)-1] {
|
||||
operations[i] = ReadOperationLine(line)
|
||||
}
|
||||
|
||||
log.Printf("mathed %s got %+v; operations : %+v\n", line, firstSplit, operations)
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
// s>2770:qs
|
||||
func ReadOperationLine(line string) (result OperationData) {
|
||||
result.InitialString = line
|
||||
re := regexp.MustCompile(`(?P<ATTRNAME>\D)(?P<OPERATION>[\>\<])(?P<NUMBER>\d+):(?P<TARGET>\D+)`)
|
||||
split := re.FindStringSubmatch(line)
|
||||
log.Printf("matching operation %s into %+v\n", line, split)
|
||||
result.AttrName = split[1]
|
||||
result.Operation = Operation([]rune(split[2])[0])
|
||||
result.SentToName = split[4]
|
||||
num, err := strconv.Atoi(split[3])
|
||||
if err != nil {
|
||||
panic(fmt.Sprintf("error getting number %s in line %s. %s", split[3], line, err))
|
||||
}
|
||||
result.Num = num
|
||||
return
|
||||
}
|
||||
|
||||
// drop trailing operations which target the same 'next' as the default; these checks are not necessary
|
||||
func SimplifyOperation(sorter SorterData) SorterData {
|
||||
actualLast := len(sorter.Operations) - 1
|
||||
for i := actualLast; i >= 0; i-- {
|
||||
if sorter.Operations[i].SentToName != sorter.DefaultState {
|
||||
break
|
||||
}
|
||||
actualLast -= 1
|
||||
}
|
||||
|
||||
sorter.Operations = sorter.Operations[:actualLast+1]
|
||||
|
||||
return sorter
|
||||
}
|
||||
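// Worked example of the simplification above, on two sorters from the notes:
//
//    lnx{m>1548:A,A}                  -> both branches already go to A, so every
//                                        check is dropped and lnx is just "default A"
//    kt{m>2215:R,x>3386:A,x<3107:R,R} -> the trailing x<3107:R matches the default R
//                                        and is dropped; m>2215:R is kept because a
//                                        non-default branch (x>3386:A) comes after it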
|
||||
type DetailData struct {
|
||||
Attrs map[string]int
|
||||
}
|
||||
|
||||
func ReadDetailsPart(text string) (result []DetailData) {
|
||||
text = strings.TrimSpace(text)
|
||||
for _, line := range strings.Split(text, "\n") {
|
||||
result = append(result, ReadDetailLine(line))
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
// {x=787,m=2655,a=1222,s=2876}
|
||||
func ReadDetailLine(line string) (result DetailData) {
|
||||
attrs := make(map[string]int)
|
||||
result.Attrs = attrs
|
||||
line = line[1 : len(line)-1]
|
||||
attrsLine := strings.Split(line, ",")
|
||||
re := regexp.MustCompile(`(?P<ATTR>\D)=(?P<NUM>\d+)`)
|
||||
for _, attrLine := range attrsLine {
|
||||
split := re.FindStringSubmatch(attrLine)
|
||||
attrName := split[1]
|
||||
num, err := strconv.Atoi(split[2])
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("error parsing detail ", line))
|
||||
}
|
||||
|
||||
attrs[attrName] = num
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func ProcessDetail(d DetailData, sorters map[string]SorterData) (isAccepted bool) {
|
||||
curSorterName := "in"
|
||||
for curSorterName != "A" && curSorterName != "R" {
|
||||
sorter, found := sorters[curSorterName]
|
||||
if !found {
|
||||
panic(fmt.Sprint("error finding soter ", curSorterName))
|
||||
}
|
||||
curSorterName = sorter.NextSorterNameFor(d)
|
||||
}
|
||||
return curSorterName == "A"
|
||||
}
|
||||
|
||||
func (s SorterData) NextSorterNameFor(d DetailData) string {
|
||||
for _, operation := range s.Operations {
|
||||
if operation.IsDetailPassing(d) {
|
||||
return operation.SentToName
|
||||
}
|
||||
}
|
||||
|
||||
return s.DefaultState
|
||||
}
|
||||
|
||||
func (o OperationData) IsDetailPassing(d DetailData) bool {
|
||||
detailValue := d.Attrs[o.AttrName]
|
||||
switch o.Operation {
|
||||
case LessThan:
|
||||
return detailValue < o.Num
|
||||
case MoreThan:
|
||||
return detailValue > o.Num
|
||||
}
|
||||
|
||||
panic(fmt.Sprint("unknown operation. ", o, d))
|
||||
}
|
||||
|
||||
type AttrIntervals map[string][][]int
|
||||
|
||||
func (o OperationData) getPassingIntervals(i AttrIntervals) AttrIntervals {
|
||||
result := make(AttrIntervals, 0)
|
||||
for key, value := range i {
|
||||
result[key] = value
|
||||
}
|
||||
|
||||
operationKey := o.AttrName
|
||||
operatedIntervals := result[operationKey]
|
||||
|
||||
switch o.Operation {
|
||||
case LessThan:
|
||||
result[operationKey] = applyLessThan(operatedIntervals, o.Num)
|
||||
case MoreThan:
|
||||
result[operationKey] = applyMoreThan(operatedIntervals, o.Num)
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func (o OperationData) getFailingIntervals(i AttrIntervals) AttrIntervals {
|
||||
result := make(AttrIntervals, 0)
|
||||
for key, value := range i {
|
||||
result[key] = value
|
||||
}
|
||||
|
||||
operationKey := o.AttrName
|
||||
operatedIntervals := result[operationKey]
|
||||
|
||||
switch o.Operation {
|
||||
case LessThan:
|
||||
result[operationKey] = applyMoreThan(operatedIntervals, o.Num-1)
|
||||
case MoreThan:
|
||||
result[operationKey] = applyLessThan(operatedIntervals, o.Num+1)
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func processInterval(i AttrIntervals, sorterName string, sorters map[string]SorterData) (combinationsAccepted int) {
|
||||
if sorterName == "A" {
|
||||
mul := 1
|
||||
for key, attrIntervals := range i {
|
||||
allowedValuesOfAttr := 0
|
||||
for _, interval := range attrIntervals {
|
||||
from := interval[0]
|
||||
to := interval[1]
|
||||
count := to - from + 1 // avoid shadowing the len builtin
|
||||
allowedValuesOfAttr += count
|
||||
log.Printf("for %s allowed attrs are %d", key, allowedValuesOfAttr)
|
||||
}
|
||||
mul *= allowedValuesOfAttr
|
||||
}
|
||||
log.Printf("exit recursion for %s. Accept interval %+v . result %d. max is %d", sorterName, i, mul, 40000 * 4000 * 4000 * 4000)
|
||||
return mul
|
||||
}
|
||||
if sorterName == "R" {
|
||||
return 0
|
||||
}
|
||||
|
||||
s := sorters[sorterName]
|
||||
log.Printf("> starting interval check for %s (%+v) on %+v", sorterName, s, i)
|
||||
intervalsPassingOnThisStep := i
|
||||
|
||||
for _, operation := range s.Operations {
|
||||
intervalsPassing := operation.getPassingIntervals(intervalsPassingOnThisStep)
|
||||
log.Printf(">> %s; in operation %+v. passing are %+v", sorterName, operation, intervalsPassing)
|
||||
ofThoseAreAccepted := processInterval(intervalsPassing, operation.SentToName, sorters)
|
||||
|
||||
combinationsAccepted += ofThoseAreAccepted
|
||||
log.Printf(">> %s; results so far are %d", sorterName, combinationsAccepted)
|
||||
|
||||
intervalsFailingAndPassedToNextCheck := operation.getFailingIntervals(intervalsPassingOnThisStep)
|
||||
log.Printf(">> %s; failing for the next step %+v", sorterName, intervalsFailingAndPassedToNextCheck)
|
||||
intervalsPassingOnThisStep = intervalsFailingAndPassedToNextCheck
|
||||
}
|
||||
|
||||
log.Printf(">> %s; about to go into DEFAULT", sorterName)
|
||||
intervalsAfterDefault := processInterval(intervalsPassingOnThisStep, s.DefaultState, sorters)
|
||||
log.Printf(">> %s; after defaul. passing are %+v", sorterName, intervalsAfterDefault)
|
||||
combinationsAccepted += intervalsAfterDefault
|
||||
log.Printf(">> %s; results after default %d", sorterName, combinationsAccepted)
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
100
day2/input
100
day2/input
@@ -1,100 +0,0 @@
|
||||
Game 1: 4 green, 3 blue, 11 red; 7 red, 5 green, 10 blue; 3 green, 8 blue, 8 red; 4 red, 12 blue; 15 red, 3 green, 10 blue
|
||||
Game 2: 3 red, 1 blue, 2 green; 1 blue, 9 green; 1 red, 10 green
|
||||
Game 3: 5 green, 9 red, 4 blue; 3 green, 7 blue; 12 blue, 3 green, 3 red; 3 blue, 7 red, 2 green; 7 blue, 3 green, 10 red
|
||||
Game 4: 2 green, 2 blue; 12 red, 9 green, 2 blue; 13 green, 15 red, 4 blue; 14 red, 3 green, 5 blue; 6 red, 1 green; 1 blue, 2 red, 2 green
|
||||
Game 5: 2 green, 6 blue; 1 red, 3 green, 5 blue; 3 green, 4 blue; 3 blue, 5 green, 1 red; 5 blue
|
||||
Game 6: 5 green, 1 blue, 3 red; 8 green, 15 red; 16 green, 5 red, 1 blue
|
||||
Game 7: 1 blue, 3 red, 11 green; 18 red, 16 blue, 5 green; 13 blue, 5 green; 1 red, 8 green, 15 blue
|
||||
Game 8: 1 green, 14 blue, 1 red; 10 blue; 1 green
|
||||
Game 9: 4 green, 12 blue, 1 red; 14 blue; 2 blue, 4 green; 4 green, 1 red, 10 blue
|
||||
Game 10: 11 green, 9 red; 12 red, 9 green; 5 red, 7 blue, 5 green; 6 green, 1 blue, 12 red; 3 red, 3 blue; 16 red, 9 blue, 7 green
|
||||
Game 11: 11 green, 1 red, 9 blue; 2 red, 13 green, 5 blue; 5 green, 2 red, 5 blue; 5 green, 7 blue; 1 red, 5 blue, 1 green
|
||||
Game 12: 5 green, 1 red; 1 red, 4 green; 1 blue, 12 green; 15 green, 4 blue; 4 blue, 19 green; 16 green, 4 blue
|
||||
Game 13: 1 red, 9 green, 5 blue; 10 blue, 7 green, 1 red; 3 green, 2 red, 14 blue; 16 blue, 3 red
|
||||
Game 14: 9 red, 1 blue, 2 green; 16 blue, 7 red; 2 green, 3 red, 14 blue; 1 green, 9 blue
|
||||
Game 15: 6 blue; 4 blue; 1 red, 16 blue, 3 green
|
||||
Game 16: 14 green, 5 red, 1 blue; 1 red, 1 blue; 5 blue
|
||||
Game 17: 1 blue, 1 green, 3 red; 2 red, 2 blue, 2 green; 1 blue, 1 red; 1 red, 2 green, 2 blue; 2 blue; 1 green, 2 red, 1 blue
|
||||
Game 18: 4 blue, 2 green, 1 red; 1 green, 1 red, 10 blue; 1 green, 1 red, 2 blue; 1 red, 5 blue; 3 green, 6 blue; 1 red, 1 green, 7 blue
|
||||
Game 19: 1 blue, 13 green, 12 red; 7 blue, 2 green, 1 red; 1 blue, 3 red, 3 green; 3 blue, 8 green, 10 red; 7 blue, 2 green
|
||||
Game 20: 1 red, 17 blue; 10 blue, 5 green; 9 green, 1 red, 3 blue; 1 red, 5 green, 1 blue
|
||||
Game 21: 3 red, 6 blue, 5 green; 4 blue, 1 red, 7 green; 6 blue, 4 red, 9 green
|
||||
Game 22: 11 blue, 2 red, 6 green; 16 blue, 5 red, 6 green; 12 red, 2 green, 10 blue; 14 blue, 2 green, 11 red
|
||||
Game 23: 3 red, 5 green; 10 blue, 1 green, 9 red; 2 red, 10 green, 9 blue; 9 blue, 7 green
|
||||
Game 24: 8 blue, 1 red; 3 red, 9 blue; 9 green, 2 red, 8 blue
|
||||
Game 25: 2 red, 1 green, 1 blue; 1 green, 12 blue, 2 red; 2 red, 1 blue; 2 blue; 1 green, 10 blue; 6 blue
|
||||
Game 26: 2 red; 4 green, 1 red, 7 blue; 11 blue, 2 red, 4 green; 1 red, 1 blue; 1 red, 5 green, 12 blue
|
||||
Game 27: 1 red, 7 green, 8 blue; 13 green, 12 blue, 1 red; 6 red, 1 green, 10 blue; 8 red, 2 blue, 2 green; 11 blue, 4 green, 4 red
|
||||
Game 28: 1 red, 8 blue, 3 green; 12 green, 4 blue; 1 red, 4 blue, 11 green; 7 blue, 10 green, 10 red; 11 blue, 7 red, 8 green; 10 red, 2 green, 2 blue
|
||||
Game 29: 4 green, 2 red; 1 blue, 11 red; 2 blue, 3 green, 1 red; 16 red; 3 green, 8 red, 1 blue; 2 blue, 7 green, 12 red
|
||||
Game 30: 1 blue, 3 green; 4 green, 2 blue; 3 red, 5 blue; 4 green, 1 red
|
||||
Game 31: 2 red, 2 blue, 3 green; 2 green, 3 blue, 8 red; 7 red, 16 blue, 2 green; 5 red, 20 blue, 2 green
|
||||
Game 32: 2 red, 1 green, 4 blue; 4 green, 4 red, 1 blue; 4 red, 4 blue; 1 blue, 4 red, 2 green; 4 blue, 3 green, 4 red
|
||||
Game 33: 11 green, 4 blue, 10 red; 2 green, 13 red, 7 blue; 13 red, 2 blue, 8 green; 15 red, 9 blue, 12 green; 14 red, 10 green, 2 blue; 13 red, 7 green
|
||||
Game 34: 11 red, 6 blue, 4 green; 16 red, 7 blue, 4 green; 6 red, 18 green, 6 blue; 3 blue, 16 red, 3 green; 2 red, 3 blue, 17 green; 3 green, 9 red, 6 blue
|
||||
Game 35: 6 green, 10 red, 12 blue; 4 red, 1 blue, 2 green; 3 green, 8 blue, 7 red; 6 red, 12 blue, 2 green
|
||||
Game 36: 4 green, 2 blue, 2 red; 3 green, 10 red, 1 blue; 1 blue, 3 green, 2 red; 2 green, 1 red; 1 blue, 5 red
|
||||
Game 37: 3 blue, 1 red, 2 green; 8 red, 4 green, 10 blue; 4 red, 4 green
|
||||
Game 38: 13 green, 3 red, 2 blue; 1 red, 13 green, 2 blue; 20 green, 3 red, 2 blue; 1 red, 2 blue, 12 green
|
||||
Game 39: 13 blue, 1 red, 8 green; 5 red, 3 green, 8 blue; 6 blue, 4 green; 18 blue, 7 green, 1 red; 4 green, 3 blue, 5 red; 6 blue, 4 red, 1 green
|
||||
Game 40: 2 red, 2 blue, 9 green; 1 blue, 2 red, 12 green; 16 green, 11 blue, 1 red; 1 green, 2 red; 3 blue, 2 red
|
||||
Game 41: 7 blue, 1 red; 4 blue, 1 red; 3 blue, 1 red, 2 green; 13 blue
|
||||
Game 42: 18 red, 1 green, 13 blue; 2 blue, 2 green, 7 red; 16 red, 12 blue; 1 green, 10 blue, 14 red
|
||||
Game 43: 15 red, 6 green, 2 blue; 3 blue, 9 red, 3 green; 13 red
|
||||
Game 44: 2 blue, 5 green, 3 red; 4 red, 4 blue, 19 green; 5 red, 3 blue, 9 green; 19 green, 6 red, 5 blue
|
||||
Game 45: 5 red, 4 green, 13 blue; 12 red, 10 blue; 3 green, 9 blue, 5 red; 10 blue, 18 red, 5 green; 16 red, 6 green, 17 blue
|
||||
Game 46: 3 green; 3 green, 2 blue; 4 blue, 2 red, 3 green; 5 blue, 3 green, 4 red; 1 green, 1 blue
|
||||
Game 47: 2 blue, 1 red, 10 green; 2 red; 6 red, 1 blue; 16 red, 2 blue, 8 green; 5 blue, 8 red, 7 green
|
||||
Game 48: 11 green, 4 red, 2 blue; 2 blue, 5 green, 8 red; 9 green, 6 red; 3 red, 3 green, 1 blue; 2 blue, 12 green, 17 red
|
||||
Game 49: 10 blue, 4 green, 1 red; 10 red, 10 blue; 12 blue, 7 red; 13 blue, 6 green
|
||||
Game 50: 1 red, 19 green, 7 blue; 4 red, 1 green, 5 blue; 16 green, 8 red, 8 blue
|
||||
Game 51: 12 green, 18 blue; 13 green, 14 blue, 4 red; 7 green, 4 red, 14 blue; 8 green, 2 blue, 3 red; 16 blue, 8 green
|
||||
Game 52: 9 blue, 9 green, 3 red; 8 blue, 1 green, 13 red; 2 red, 8 blue, 9 green; 13 red, 4 green; 6 green, 15 red; 11 blue, 11 red, 9 green
|
||||
Game 53: 2 red, 4 green, 3 blue; 5 blue, 16 green; 4 blue, 8 red, 12 green
|
||||
Game 54: 6 red, 16 green; 6 red, 15 green; 8 green, 8 red, 2 blue
|
||||
Game 55: 9 red, 2 green; 4 blue; 2 green, 2 red, 7 blue; 1 red, 16 blue, 1 green; 17 blue, 5 red
|
||||
Game 56: 14 green, 3 red, 9 blue; 14 blue, 15 green, 2 red; 8 red, 13 blue, 15 green; 15 blue, 2 red, 12 green; 3 red, 7 blue, 10 green; 10 blue, 13 green
|
||||
Game 57: 1 blue, 10 green, 2 red; 4 blue, 9 green, 11 red; 2 blue
|
||||
Game 58: 4 red, 2 blue, 5 green; 1 blue, 5 green, 4 red; 3 green, 4 red, 8 blue; 4 blue, 7 green; 5 green, 4 blue; 1 blue, 6 red
|
||||
Game 59: 5 blue, 4 red, 3 green; 8 blue, 12 green, 5 red; 5 red, 8 blue, 15 green
|
||||
Game 60: 6 red, 12 blue, 1 green; 10 blue, 20 green, 4 red; 6 blue, 1 green, 5 red; 9 red, 12 blue, 14 green; 15 green, 1 red, 14 blue; 10 green, 13 blue
|
||||
Game 61: 1 blue, 12 green, 3 red; 4 green, 1 red, 4 blue; 8 red, 4 green, 6 blue
|
||||
Game 62: 6 blue, 7 green, 3 red; 6 blue, 3 red, 3 green; 11 green, 6 red, 2 blue; 2 red, 6 blue, 3 green; 2 green, 3 blue, 3 red; 3 blue, 11 green, 11 red
|
||||
Game 63: 5 green, 6 blue, 4 red; 6 green, 12 blue; 3 green, 9 blue, 10 red; 1 blue, 4 red, 5 green
|
||||
Game 64: 10 green, 14 red; 1 blue, 9 red; 3 green, 10 blue, 14 red; 5 green, 3 blue, 12 red; 5 blue, 12 red, 13 green
|
||||
Game 65: 1 red, 5 green, 10 blue; 14 red, 5 green, 10 blue; 10 blue, 10 red
|
||||
Game 66: 9 green, 8 blue, 1 red; 8 red, 14 blue; 8 red, 7 blue, 2 green; 4 blue, 3 green, 5 red; 2 red, 8 green, 8 blue
|
||||
Game 67: 4 red, 3 green, 3 blue; 4 green, 1 blue, 4 red; 1 blue, 3 red; 10 blue; 16 blue, 6 red, 4 green
|
||||
Game 68: 6 blue, 6 green, 9 red; 4 blue, 9 red, 3 green; 3 blue, 8 red
|
||||
Game 69: 4 green, 12 red, 3 blue; 2 red, 3 blue; 2 blue, 4 red, 2 green; 1 blue, 3 red
|
||||
Game 70: 4 red, 3 green, 15 blue; 1 green, 4 red; 1 red, 1 green, 5 blue
|
||||
Game 71: 4 blue, 2 red, 10 green; 7 red, 6 blue, 11 green; 4 blue, 7 red, 8 green
|
||||
Game 72: 9 red, 9 blue, 1 green; 4 red, 6 green, 5 blue; 3 green, 7 red, 2 blue
|
||||
Game 73: 3 green, 9 red; 4 green, 15 red; 12 red, 2 blue; 14 red, 3 green
|
||||
Game 74: 2 red, 6 blue, 1 green; 3 red, 6 blue; 1 green, 12 blue, 14 red
|
||||
Game 75: 3 green, 18 red; 1 green, 7 red, 1 blue; 2 red, 2 green, 3 blue; 11 red; 2 red, 3 green, 2 blue
|
||||
Game 76: 6 green, 2 red, 5 blue; 13 green, 5 blue; 5 blue, 1 red, 1 green
|
||||
Game 77: 4 blue, 6 green, 3 red; 15 red, 1 green; 4 green, 11 red, 13 blue; 8 blue, 6 green, 9 red; 3 blue, 1 green, 11 red; 3 green, 3 red
|
||||
Game 78: 11 green, 1 blue, 2 red; 7 red, 16 blue, 11 green; 9 blue, 10 red, 6 green; 1 green, 8 blue, 10 red; 8 blue, 6 red, 1 green
|
||||
Game 79: 2 blue, 5 green, 4 red; 1 blue, 1 red, 1 green; 1 blue, 5 red, 10 green; 6 red, 3 green, 3 blue; 8 red, 9 green, 6 blue; 7 blue, 6 green, 13 red
|
||||
Game 80: 10 green, 7 blue, 5 red; 5 red, 1 green, 6 blue; 8 blue, 2 red, 8 green
|
||||
Game 81: 3 green, 10 red; 6 blue, 8 green, 14 red; 4 green, 4 blue, 13 red; 5 blue, 11 green, 6 red; 16 red, 8 green, 5 blue; 6 green, 18 red, 6 blue
|
||||
Game 82: 13 red, 1 green, 7 blue; 8 green, 4 blue, 12 red; 18 red, 5 green, 3 blue; 13 red, 4 green, 9 blue
|
||||
Game 83: 1 red, 3 green, 4 blue; 5 blue, 4 green, 1 red; 3 green, 1 red, 12 blue; 4 green, 11 blue
|
||||
Game 84: 3 blue, 10 green, 2 red; 3 red, 8 blue; 11 blue, 12 red, 14 green; 2 red, 11 green, 2 blue
|
||||
Game 85: 8 blue, 2 green, 1 red; 13 blue, 6 red; 3 blue, 5 green
|
||||
Game 86: 16 red, 8 blue; 7 blue; 16 red, 16 blue, 1 green; 15 blue, 11 red; 2 green, 7 red, 5 blue
|
||||
Game 87: 6 green, 9 blue, 4 red; 1 red, 1 green, 4 blue; 5 blue, 13 green, 3 red; 2 green, 4 red; 16 blue, 10 green, 3 red
|
||||
Game 88: 1 blue, 14 red; 14 red, 3 blue, 8 green; 1 blue, 5 green
|
||||
Game 89: 12 green, 14 blue, 3 red; 2 red, 3 blue, 3 green; 2 blue, 8 green; 1 red, 3 green, 15 blue; 3 red, 5 blue
|
||||
Game 90: 3 blue, 17 red, 11 green; 2 red, 2 blue, 7 green; 7 blue; 8 blue, 4 green, 10 red; 1 blue, 4 red
|
||||
Game 91: 10 red, 9 blue, 8 green; 5 blue, 10 red, 2 green; 11 red, 17 green, 7 blue; 12 blue, 16 red, 18 green; 20 green, 5 blue, 15 red
|
||||
Game 92: 1 green, 14 red, 1 blue; 2 blue, 6 green; 9 red, 6 green; 5 blue, 5 red, 2 green; 3 blue, 3 green, 10 red; 5 blue, 1 red
|
||||
Game 93: 10 green, 1 red, 6 blue; 16 red, 5 blue, 2 green; 3 red, 7 green, 11 blue; 12 green, 5 blue, 4 red; 8 green, 7 blue, 10 red; 1 red, 5 blue
|
||||
Game 94: 3 blue, 1 red, 3 green; 1 blue, 4 green, 4 red; 9 green
|
||||
Game 95: 3 green, 5 blue, 9 red; 2 green, 9 red, 2 blue; 12 red, 9 green; 11 green, 9 red, 9 blue; 9 blue, 6 green, 10 red; 13 red, 2 blue, 5 green
|
||||
Game 96: 2 red, 19 blue, 2 green; 10 blue, 1 red, 2 green; 9 blue, 1 red; 2 green, 3 blue; 1 green, 1 red, 11 blue
|
||||
Game 97: 6 green, 7 blue, 5 red; 7 green, 1 red, 11 blue; 6 green, 6 red, 5 blue; 2 red, 9 blue, 1 green
|
||||
Game 98: 5 green, 8 red, 15 blue; 16 green, 9 blue, 8 red; 5 blue, 3 red, 2 green; 13 blue, 12 green, 4 red; 2 red, 15 green, 3 blue; 1 green, 11 blue, 2 red
|
||||
Game 99: 1 green, 7 blue, 6 red; 16 blue, 9 red; 1 green, 17 red, 12 blue; 15 red, 7 blue; 8 blue, 14 red
|
||||
Game 100: 5 blue, 11 red, 6 green; 11 red, 2 blue, 5 green; 6 blue, 6 green; 2 blue, 6 red, 15 green; 7 red, 4 blue, 7 green
|
||||
5
day20/example1
Normal file
@@ -0,0 +1,5 @@
|
||||
broadcaster -> a, b, c
|
||||
%a -> b
|
||||
%b -> c
|
||||
%c -> inv
|
||||
&inv -> a
|
||||
5
day20/example2
Normal file
@@ -0,0 +1,5 @@
|
||||
broadcaster -> a
|
||||
%a -> inv, con
|
||||
&inv -> b
|
||||
%b -> con
|
||||
&con -> output
|
||||
142
day20/looping.go
Normal file
@@ -0,0 +1,142 @@
|
||||
package day20
|
||||
|
||||
import (
|
||||
"cmp"
|
||||
"log"
|
||||
"slices"
|
||||
)
|
||||
|
||||
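// TransitiveOutputs walks Outputs() recursively from `from`, collecting every
// reachable module name into `visited`; the shared sink "th" is dropped at the
// end so each subgraph only contains its own loop's modules.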
func TransitiveOutputs(from string, allModules map[string]Module, visited map[string]any) map[string]any {
|
||||
// log.Printf("looking for transitive children of %s\n", from)
|
||||
_, alreadyProcessed := visited[from]
|
||||
if alreadyProcessed {
|
||||
return visited
|
||||
}
|
||||
|
||||
module, found := allModules[from]
|
||||
if !found {
|
||||
return visited
|
||||
}
|
||||
|
||||
visited[from] = struct{}{}
|
||||
children := module.Outputs()
|
||||
for _, output := range children {
|
||||
TransitiveOutputs(output, allModules, visited)
|
||||
}
|
||||
|
||||
delete(visited, "th")
|
||||
|
||||
// for key, _ := range visited {
|
||||
// result = append(result, key)
|
||||
// }
|
||||
|
||||
return visited
|
||||
}
|
||||
|
||||
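// FindSubGraphLoopLength keeps pressing the button, snapshotting only the
// state of the given subgraph after every press, and returns the first step
// range on which that state repeats, together with the pulses emitted by
// `monitorOutputsOf` on each step.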
func FindSubGraphLoopLength(subgraph map[string]any, allModules map[string]Module, monitorOutputsOf string) (fromStep, toStep int, monitoredPulses map[int][]PulseType) {
|
||||
step := 1
|
||||
seenSubgraphStates := make(map[string]int)
|
||||
monitoredPulses = make(map[int][]PulseType)
|
||||
for {
|
||||
monitoredPulsesOfTheStep := PropagateButtonPressWithMonitor(allModules, step, monitorOutputsOf)
|
||||
subgraphModules := make(map[string]Module)
|
||||
for key := range subgraph {
|
||||
subgraphModules[key] = allModules[key]
|
||||
}
|
||||
subgraphState := ModulesState(subgraphModules)
|
||||
// log.Printf("looping %d. state is %s", step, subgraphState)
|
||||
|
||||
prevSteps, known := seenSubgraphStates[subgraphState]
|
||||
if known {
|
||||
// log.Printf(">>> searching for loop of %+v", subgraph)
|
||||
log.Printf(">>> found loop from %d to %d. of size %d\n", prevSteps, step - 1, step - prevSteps)
|
||||
return prevSteps, step, monitoredPulses
|
||||
}
|
||||
|
||||
seenSubgraphStates[subgraphState] = step
|
||||
if len(monitoredPulsesOfTheStep) > 0 {
|
||||
monitoredPulses[step] = monitoredPulsesOfTheStep
|
||||
}
|
||||
step++
|
||||
}
|
||||
panic("")
|
||||
}
|
||||
|
||||
// i see lots of 'LowPulse'
|
||||
// while i want to find steps where all inputs are remembered as High.
|
||||
// so i'm interested in steps with "high" pulses and next steps that make it 'low' after
|
||||
func FilterMonitoredPulses(requestedPulses map[int][]PulseType) {
|
||||
afterHigh := false
|
||||
for step, pulses := range requestedPulses {
|
||||
processedPulses := make([]PulseType, 0)
|
||||
for _, pulse := range pulses {
|
||||
if pulse == HighPulse {
|
||||
processedPulses = append(processedPulses, pulse)
|
||||
afterHigh = true
|
||||
continue
|
||||
}
|
||||
if afterHigh {
|
||||
processedPulses = append(processedPulses, pulse)
|
||||
afterHigh = false
|
||||
}
|
||||
}
|
||||
if len(processedPulses) > 0 {
|
||||
requestedPulses[step] = processedPulses
|
||||
} else {
|
||||
delete(requestedPulses, step)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// loop math
|
||||
// 2023/12/20 12:35:08 >>> searching for loop of sr
|
||||
// 2023/12/20 12:35:08 >>> found loop from 1 to 4028. of size 4028
|
||||
// 2023/12/20 12:35:08 the pulses: +map[4026:[high low]]
|
||||
// 2023/12/20 12:35:08 >>> searching for loop of ch
|
||||
// 2023/12/20 12:35:08 >>> found loop from 0 to 3923. of size 3924
|
||||
// 2023/12/20 12:35:08 the pulses: +map[3817:[high low]]
|
||||
// 2023/12/20 12:35:08 >>> searching for loop of hd
|
||||
// 2023/12/20 12:35:09 >>> found loop from 0 to 3793. of size 3794
|
||||
// 2023/12/20 12:35:09 the pulses: +map[3427:[high low]]
|
||||
// 2023/12/20 12:35:09 >>> searching for loop of bx
|
||||
// 2023/12/20 12:35:09 >>> found loop from 0 to 3739. of size 3740
|
||||
// 2023/12/20 12:35:09 the pulses: +map[3211:[high low]]
|
||||
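// CalcCommonStep advances whichever loop is currently behind by its own loop
// length until all four sit on the same step - a brute-force alternative to
// taking the least common multiple of the cycle lengths.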
func CalcCommonStep() int {
|
||||
type LoopInfo struct {
|
||||
loopLength, initialDesiredStep int
|
||||
curStep int
|
||||
}
|
||||
loopA := &LoopInfo{4027, 4026, 4026}
|
||||
loopB := &LoopInfo{3923, 3922, 3922}
|
||||
loopC := &LoopInfo{3793, 3792, 3792}
|
||||
loopD := &LoopInfo{3739, 3211, 3738}
|
||||
|
||||
|
||||
|
||||
// nope they can have different amount of own loops.
|
||||
// so it's 4 unknowns, 5 unknowns.
|
||||
// i could store 4 'steps' and on each iteration increase the smallest one
|
||||
// until they are all equal
|
||||
|
||||
loops := []*LoopInfo{loopA, loopB, loopC, loopD}
|
||||
i := 0

// re-check the steps on every iteration so the loop can terminate
for loopA.curStep != loopB.curStep ||
loopB.curStep != loopC.curStep ||
loopC.curStep != loopD.curStep {
minLoop := slices.MinFunc(loops, func(a *LoopInfo, b *LoopInfo) int {
return cmp.Compare(a.curStep, b.curStep)
})
minLoop.curStep += minLoop.loopLength

if i % 10000000 == 0 {
log.Printf(">> iterations made: %d, min step is %d", i, minLoop.curStep)
}

i++
}
|
||||
|
||||
return loopA.curStep
|
||||
}
|
||||
277
day20/modules.go
Normal file
@@ -0,0 +1,277 @@
|
||||
package day20
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"regexp"
|
||||
"slices"
|
||||
"strings"
|
||||
)
|
||||
|
||||
type PulseType int
|
||||
func (pt PulseType)String() string {
|
||||
types := []string{"high", "low"}
|
||||
return types[pt]
|
||||
}
|
||||
|
||||
const (
|
||||
HighPulse PulseType = iota
|
||||
LowPulse
|
||||
)
|
||||
|
||||
type Signal struct {
|
||||
To, From string
|
||||
PulseType PulseType
|
||||
}
|
||||
|
||||
type Module interface {
|
||||
Receive(s Signal) []Signal
|
||||
Outputs() []string
|
||||
StateSnapshot() string
|
||||
MermaidFlow() string
|
||||
}
|
||||
|
||||
// Modules
|
||||
type FlipFlop struct {
|
||||
Name string
|
||||
OutputNames []string
|
||||
IsOn bool
|
||||
}
|
||||
|
||||
// ignores HighPulse
|
||||
// on LowPulse - toggle state and send signal
|
||||
func (ff *FlipFlop)Receive(s Signal) []Signal {
|
||||
if s.PulseType == HighPulse {
|
||||
return []Signal{}
|
||||
}
|
||||
|
||||
ff.IsOn = !ff.IsOn
|
||||
outTemplate := Signal{
|
||||
From: ff.Name,
|
||||
}
|
||||
if ff.IsOn {
|
||||
outTemplate.PulseType = HighPulse
|
||||
} else {
|
||||
outTemplate.PulseType = LowPulse
|
||||
}
|
||||
|
||||
result := make([]Signal, len(ff.OutputNames))
|
||||
for i, outName := range ff.OutputNames {
|
||||
out := outTemplate
|
||||
out.To = outName
|
||||
result[i] = out
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func (ff *FlipFlop)Outputs() []string {
|
||||
return ff.OutputNames
|
||||
}
|
||||
|
||||
func (ff *FlipFlop)String() string {
|
||||
return fmt.Sprintf("[flip-flop '%s' (on: %t) -> %s]", ff.Name, ff.IsOn, ff.OutputNames)
|
||||
}
|
||||
|
||||
func (ff *FlipFlop)StateSnapshot() string {
|
||||
return ff.String()
|
||||
}
|
||||
|
||||
func (ff *FlipFlop)MermaidFlow() string {
|
||||
result := "\n"
|
||||
for _, toName := range ff.OutputNames {
|
||||
result += fmt.Sprintf("%s --> %s\n", ff.Name, toName)
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
func IsLineFlipFlop(line string) bool {
|
||||
return strings.HasPrefix(line, "%")
|
||||
}
|
||||
|
||||
func ParseFlipFlop(line string) (result FlipFlop) {
|
||||
re := regexp.MustCompile(`%(?P<NAME>\D+) -> (?P<OUTPUTS>.+)`)
|
||||
matches := re.FindStringSubmatch(line)
|
||||
|
||||
// log.Printf("matching %s getting '%s' and '%s'\n", line, matches[1], matches[2])
|
||||
result.Name = matches[1]
|
||||
result.OutputNames = strings.Split(matches[2], ", ")
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
type Broadcast struct {
|
||||
OutputNames []string
|
||||
}
|
||||
|
||||
// send same pulse to all outputs
|
||||
func (b *Broadcast)Receive(s Signal) (result []Signal) {
|
||||
signalTemplate := Signal{From: "broadcast", PulseType: s.PulseType}
|
||||
for _, out := range b.OutputNames {
|
||||
outSignal := signalTemplate
|
||||
outSignal.To = out
|
||||
result = append(result, outSignal)
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (b *Broadcast)Outputs() []string {
|
||||
return b.OutputNames
|
||||
}
|
||||
|
||||
func (b *Broadcast)String() string {
|
||||
return fmt.Sprintf("[broadcast -> %+v]", b.OutputNames)
|
||||
}
|
||||
func (b *Broadcast)StateSnapshot() string {
|
||||
return b.String()
|
||||
}
|
||||
|
||||
func (b *Broadcast)MermaidFlow() string {
|
||||
result := "\n"
|
||||
for _, toName := range b.OutputNames {
|
||||
result += fmt.Sprintf("%s --> %s\n", "broadcast", toName)
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
func IsLineBroadcast(line string) bool {
|
||||
return strings.HasPrefix(line, "broadcaster")
|
||||
}
|
||||
|
||||
func ParseBroadcast(line string) (result Broadcast) {
|
||||
re := regexp.MustCompile(`broadcaster -> (?P<OUTPUTS>.+)`)
|
||||
matches := re.FindStringSubmatch(line)
|
||||
|
||||
result.OutputNames = strings.Split(matches[1], ", ")
|
||||
return
|
||||
}
|
||||
|
||||
type Conjunction struct {
|
||||
Name string
|
||||
OutputNames []string
|
||||
MostRecentPulseFromInputIsHigh map[string]bool
|
||||
}
|
||||
|
||||
// remembers last signal type from all inputs (initial default is Low)
|
||||
// when receiving pulse, first update memory for that input
|
||||
// then if for all inputs remembered is high - send LowPulse
|
||||
// otherwise, if any remembered pulse is low - send HighPulse
|
||||
func (c *Conjunction)Receive(s Signal) (result []Signal) {
|
||||
c.MostRecentPulseFromInputIsHigh[s.From] = s.PulseType == HighPulse
|
||||
|
||||
allHigh := true
|
||||
for _, latestImpulseHigh := range c.MostRecentPulseFromInputIsHigh {
if !latestImpulseHigh {
|
||||
allHigh = false
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
outTemplate := Signal{From: c.Name}
|
||||
if allHigh {
|
||||
outTemplate.PulseType = LowPulse
|
||||
} else {
|
||||
outTemplate.PulseType = HighPulse
|
||||
}
|
||||
|
||||
for _, outName := range c.OutputNames {
|
||||
outSignal := outTemplate
|
||||
outSignal.To = outName
|
||||
result = append(result, outSignal)
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (c *Conjunction)Outputs() []string {
|
||||
return c.OutputNames
|
||||
}
|
||||
|
||||
func (c *Conjunction)RegisterInputs(allModules map[string]Module) {
|
||||
for name, module := range allModules {
|
||||
if slices.Contains( module.Outputs(), c.Name) {
|
||||
c.MostRecentPulseFromInputIsHigh[name] = false
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
func (c *Conjunction)String() string {
|
||||
return fmt.Sprintf("[conjunction '%s' -> %+v]", c.Name, c.OutputNames)
|
||||
}
|
||||
|
||||
func (c *Conjunction)StateSnapshot() string {
|
||||
return fmt.Sprintf("[conjunction '%s' -> %+v]", c.Name, c.MostRecentPulseFromInputIsHigh)
|
||||
}
|
||||
|
||||
func (c *Conjunction)MermaidFlow() string {
|
||||
result := "\n"
|
||||
result += fmt.Sprintf("%s{%s}\n", c.Name, c.Name)
|
||||
for _, toName := range c.OutputNames {
|
||||
result += fmt.Sprintf("%s --> %s\n", c.Name, toName)
|
||||
}
|
||||
return result
|
||||
|
||||
}
|
||||
|
||||
func IsLineConjunction(line string) bool {
|
||||
return strings.HasPrefix(line, "&")
|
||||
}
|
||||
|
||||
func ParseConjunction(line string) (result Conjunction) {
|
||||
re := regexp.MustCompile(`&(?P<NAME>\D+) -> (?P<OUTPUTS>.+)`)
|
||||
matches := re.FindStringSubmatch(line)
|
||||
|
||||
// log.Printf("matching %s getting '%s' and '%s'\n", line, matches[1], matches[2])
|
||||
result.Name = matches[1]
|
||||
result.OutputNames = strings.Split(matches[2], ", ")
|
||||
|
||||
result.MostRecentPulseFromInputIsHigh = map[string]bool{}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
type Button struct {}
|
||||
|
||||
func (b *Button)Receive(s Signal) []Signal {
|
||||
return []Signal{
|
||||
{ To: "broadcast", From: "button", PulseType: LowPulse },
|
||||
}
|
||||
}
|
||||
|
||||
func (b *Button)Outputs() []string {
|
||||
return []string{"broadcast"}
|
||||
}
|
||||
|
||||
func (b *Button)String() string {
|
||||
return "[button]"
|
||||
}
|
||||
|
||||
func (b *Button)StateSnapshot() string {
|
||||
return b.String()
|
||||
}
|
||||
|
||||
func (b *Button)MermaidFlow() string {
|
||||
return "button --> broadcast\n"
|
||||
}
|
||||
|
||||
type Output struct {}
|
||||
|
||||
func (o *Output)Receive(s Signal) []Signal {
|
||||
// log.Print("Outut received signal: ", s)
|
||||
return []Signal{}
|
||||
}
|
||||
|
||||
func (o *Output)Outputs() []string {
|
||||
return []string{}
|
||||
}
|
||||
|
||||
func (o *Output)String() string {
|
||||
return "[output]"
|
||||
}
|
||||
|
||||
func (o *Output)StateSnapshot() string {
|
||||
return o.String()
|
||||
}
|
||||
|
||||
func (o *Output)MermaidFlow() string {
|
||||
return ""
|
||||
}
|
||||
61
day20/modules_test.go
Normal file
@@ -0,0 +1,61 @@
|
||||
package day20
|
||||
|
||||
import (
|
||||
"slices"
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestParseFlipFlop(t *testing.T) {
|
||||
flipFlopLine := "%a -> inv, con"
|
||||
if !IsLineFlipFlop(flipFlopLine) {
|
||||
t.Errorf("line '%s' should be flip flop\n", flipFlopLine)
|
||||
}
|
||||
module := ParseFlipFlop(flipFlopLine)
|
||||
t.Logf("got module %+v\n", module)
|
||||
}
|
||||
|
||||
func TestParseBroadcast(t *testing.T) {
|
||||
broadcastLine := "broadcaster -> a, b, c"
|
||||
if !IsLineBroadcast(broadcastLine) {
|
||||
t.Error("expected line to pass broadcast check")
|
||||
}
|
||||
module := ParseBroadcast(broadcastLine)
|
||||
t.Logf("got module %+v\n", module)
|
||||
|
||||
if !slices.Equal(module.OutputNames, []string{"a", "b", "c"}) {
|
||||
t.Errorf("got unexpected outputs: %+v\n", module.OutputNames)
|
||||
}
|
||||
}
|
||||
|
||||
func TestParseConjunction(t *testing.T) {
|
||||
conjunctionLine := "&inv -> b"
|
||||
if !IsLineConjunction(conjunctionLine) {
|
||||
t.Errorf("line '%s' should be flip flop\n", conjunctionLine)
|
||||
}
|
||||
module := ParseConjunction(conjunctionLine)
|
||||
t.Logf("got module %+v\n", module)
|
||||
moduleAsExpected := module.Name == "inv" && slices.Equal(module.OutputNames, []string{"b"})
|
||||
if !moduleAsExpected {
|
||||
t.Fail()
|
||||
}
|
||||
}
|
||||
|
||||
func TestReadManyModules(t *testing.T) {
|
||||
filename := "example1"
|
||||
modules := ReadModules(filename)
|
||||
t.Logf("> read example1:\n%+v", modules)
|
||||
|
||||
filename2 := "example2"
|
||||
modules2 := ReadModules(filename2)
|
||||
t.Logf("> read example2:\n%+v", modules2)
|
||||
}
|
||||
|
||||
func TestConjunctionRegisterInputs(t *testing.T) {
|
||||
filename := "example2"
|
||||
modules := ReadModules(filename)
|
||||
|
||||
conjunctionInv := modules["inv"].(*Conjunction)
|
||||
conjunctionInv.RegisterInputs(modules)
|
||||
|
||||
t.Logf("after registering inputs on $inv : %+v", conjunctionInv.MostRecentPulseFromInputIsHigh)
|
||||
}
|
||||
178
day20/my-mermaid.mmd
Normal file
@@ -0,0 +1,178 @@
|
||||
flowchart LR
|
||||
|
||||
gm --> tj
|
||||
gm --> gf
|
||||
|
||||
nn --> ff
|
||||
nn --> db
|
||||
|
||||
broadcast --> sr
|
||||
broadcast --> ch
|
||||
broadcast --> hd
|
||||
broadcast --> bx
|
||||
|
||||
qm --> gm
|
||||
|
||||
xm --> db
|
||||
|
||||
pf --> qx
|
||||
|
||||
ln --> gf
|
||||
ln --> qq
|
||||
|
||||
pm --> vc
|
||||
pm --> qk
|
||||
|
||||
rz --> qx
|
||||
rz --> cv
|
||||
|
||||
gf{gf}
|
||||
gf --> fj
|
||||
gf --> qm
|
||||
gf --> xn
|
||||
gf --> sr
|
||||
|
||||
fn --> pr
|
||||
fn --> gf
|
||||
|
||||
lc --> gf
|
||||
lc --> fn
|
||||
|
||||
sr --> gf
|
||||
sr --> vl
|
||||
|
||||
jz --> qj
|
||||
jz --> db
|
||||
|
||||
th{th}
|
||||
th --> rx
|
||||
|
||||
cb --> kt
|
||||
|
||||
bf --> qx
|
||||
bf --> pf
|
||||
|
||||
qj --> xm
|
||||
qj --> db
|
||||
|
||||
ch --> db
|
||||
ch --> mc
|
||||
|
||||
ff --> pl
|
||||
|
||||
pr --> gf
|
||||
|
||||
zd --> ln
|
||||
zd --> gf
|
||||
|
||||
qn{qn}
|
||||
qn --> th
|
||||
|
||||
kt --> qx
|
||||
kt --> rz
|
||||
|
||||
fj --> zd
|
||||
|
||||
tj --> lc
|
||||
tj --> gf
|
||||
|
||||
bx --> qx
|
||||
bx --> qp
|
||||
|
||||
cr --> gx
|
||||
cr --> vc
|
||||
|
||||
vm --> cl
|
||||
|
||||
nh --> hv
|
||||
|
||||
qk --> vc
|
||||
|
||||
jd --> qx
|
||||
jd --> vm
|
||||
|
||||
hd --> vc
|
||||
hd --> nh
|
||||
|
||||
sf --> bp
|
||||
|
||||
cl --> qx
|
||||
cl --> bf
|
||||
|
||||
vc{vc}
|
||||
vc --> lr
|
||||
vc --> hd
|
||||
vc --> ks
|
||||
vc --> qn
|
||||
vc --> gx
|
||||
vc --> nh
|
||||
vc --> hv
|
||||
|
||||
bp --> db
|
||||
bp --> jz
|
||||
|
||||
cc --> nn
|
||||
|
||||
lr --> sb
|
||||
|
||||
qq --> qm
|
||||
qq --> gf
|
||||
|
||||
db{db}
|
||||
db --> ff
|
||||
db --> ds
|
||||
db --> sf
|
||||
db --> ch
|
||||
db --> cc
|
||||
db --> xf
|
||||
|
||||
vl --> gf
|
||||
vl --> fj
|
||||
|
||||
ks --> vz
|
||||
|
||||
xn{xn}
|
||||
xn --> th
|
||||
|
||||
xf{xf}
|
||||
xf --> th
|
||||
|
||||
pl --> sf
|
||||
pl --> db
|
||||
|
||||
zl{zl}
|
||||
zl --> th
|
||||
|
||||
vz --> cr
|
||||
vz --> vc
|
||||
|
||||
gx --> cd
|
||||
|
||||
mc --> ds
|
||||
mc --> db
|
||||
|
||||
qp --> cb
|
||||
qp --> qx
|
||||
button --> broadcast
|
||||
|
||||
cv --> xz
|
||||
|
||||
xz --> jd
|
||||
|
||||
qx{qx}
|
||||
qx --> cb
|
||||
qx --> cv
|
||||
qx --> bx
|
||||
qx --> xz
|
||||
qx --> vm
|
||||
qx --> zl
|
||||
|
||||
hv --> lr
|
||||
|
||||
cd --> pm
|
||||
cd --> vc
|
||||
|
||||
sb --> ks
|
||||
sb --> vc
|
||||
|
||||
ds --> cc
|
||||
1
day20/my-mermaid.mmd.svg
Normal file
File diff suppressed because one or more lines are too long
|
After Width: | Height: | Size: 141 KiB |
180
day20/notes.org
Normal file
@@ -0,0 +1,180 @@
|
||||
#+title: Notes
|
||||
* ok. only thought i had was to simulate the thing
|
||||
|
||||
have single executor, that takes head of the queue,
|
||||
signals would be (to, from, type)
|
||||
|
||||
take 'to' out of the map, call its 'process(from, type)'
|
||||
|
||||
and different types of executors would implement this differently.
|
||||
and return a slice of new signals in order, to be appended.
|
||||
|
||||
if queue is empty - the single button press is propagated and all is well.
|
||||
|
||||
we will take snapshot of state, String() repr of all executors should be enough,
|
||||
and save amount of signals sent so far
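rough sketch of that queue loop (sketch only, types as in day20/modules.go; the real code ended up in pulsePropagation.go):
#+begin_src go
// single button press: pop the head of the queue, let the addressed module
// react, append whatever signals it returns, stop when the queue drains
queue := []Signal{{From: "button", To: "broadcast", PulseType: LowPulse}}
for len(queue) > 0 {
	cur := queue[0]
	queue = queue[1:]
	if m, ok := modules[cur.To]; ok {
		queue = append(queue, m.Receive(cur)...)
	}
}
#+end_src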
|
||||
* also, i suppose i'd want to have entry points for fiddling with single executors to be test cases.
|
||||
* modules to implement
|
||||
** DONE Broadcast
|
||||
** DONE Flip-Flop
|
||||
** DONE Conjunction
|
||||
** DONE Button
|
||||
* i guess each module could test if a string is a representation of its type
|
||||
and would be able to parse it? into its own struct?
|
||||
well, those are just functions, since only methods are associated, so ok
|
||||
* how do i run single tests?
|
||||
** running tests from the module
|
||||
#+begin_src bash
|
||||
go test sunshine.industries/aoc2023/day20 -v
|
||||
#+end_src
|
||||
|
||||
have file with `_test.go` and `func Test...(t *testing.T) {}` name
|
||||
** running single test
|
||||
#+begin_src bash
|
||||
go test sunshine.industries/aoc2023/day20 -v -run TestParseFlipFlop
|
||||
#+end_src
|
||||
* yikes. if i don't know the 'inputs' to the conjunction, don't know how to check for 'all high'
|
||||
let's add registering after the map is read.
|
||||
* well. for part 2 brute force doesn't work.
|
||||
how could i examine inputs to the 'rx' to see when it will receive 'low'?
|
||||
|
||||
i suppose inputs could be on prime cycles, which would align to all required values only on a very big step?
|
||||
|
||||
let's do some kind of visualization?
|
||||
|
||||
how would i do graphviz or mermaidjs?
|
||||
|
||||
flowchart in mermaid should be it
|
||||
|
||||
go run . > day20/my-mermaid.mmd
|
||||
* so, looking at the thingy.
|
||||
rx is produced by &th
|
||||
which has inputs of
|
||||
11:&xn -> th
|
||||
14:&qn -> th
|
||||
16:&xf -> th
|
||||
32:&zl -> th
|
||||
|
||||
|
||||
for rx to receive a low pulse.
|
||||
&th should receive High Pulse, while all other inputs alse remembered as high.
|
||||
|
||||
this is not too easy.
|
||||
but first let's check if loops over
|
||||
- xn
|
||||
- qn
|
||||
- xf
|
||||
- zl
|
||||
|
||||
are manageable.
|
||||
|
||||
well.
|
||||
i'll need to what?
|
||||
not only track the inputs of the th.
|
||||
but state of the 'subloop'
|
||||
and they are separate
|
||||
|
||||
is there an easy way to collect the names from each subloop?
|
||||
i guess i could write a collect.
|
||||
|
||||
from each of outputs of 'broadcast'
|
||||
|
||||
then have a function that checks loop size of each subgraph
|
||||
|
||||
but i will also need to figure out on which steps output of the loop is remembered as High \ Low
|
||||
|
||||
let's start with loop size? and modify things if need be
|
||||
** starting points of loops:
|
||||
children of the broadcast:
|
||||
broadcaster -> sr, ch, hd, bx
|
||||
|
||||
sr, ch, hd, bx
|
||||
** ok. some data here
|
||||
2023/12/20 12:05:06 >>> searching for loop of sr
|
||||
2023/12/20 12:05:06 >>> found loop from 1 to 4028. of size 4028
|
||||
2023/12/20 12:05:06 >>> searching for loop of ch
|
||||
2023/12/20 12:05:06 >>> found loop from 0 to 3923. of size 3924
|
||||
2023/12/20 12:05:06 >>> searching for loop of hd
|
||||
2023/12/20 12:05:06 >>> found loop from 0 to 3793. of size 3794
|
||||
2023/12/20 12:05:06 >>> searching for loop of bx
|
||||
2023/12/20 12:05:07 >>> found loop from 0 to 3739. of size 3740
|
||||
|
||||
one of these guys starts from 1, not from 0.
|
||||
this is unusual, but OK
|
||||
|
||||
now, i want to figure out what are the steps where output for each cycle is 'considered as saved as 1'
|
||||
|
||||
i guess i could just directly probe the
|
||||
`th`
|
||||
|
||||
on each step up to 4028
|
||||
|
||||
but also, if the signals from those are rare - would be easier to collect steps of each signal.
|
||||
** ok. i collected 'monitored pulses' and i see lots of 'Low'
|
||||
what i want is all "high" and first low after those.
|
||||
** oh wow, this crap
|
||||
2023/12/20 12:30:05 >>> searching for loop of ch
|
||||
2023/12/20 12:30:05 >>> found loop from 1 to 3924. of size 3924
|
||||
2023/12/20 12:30:05 the pulses
|
||||
+map[3922:[high low]]
|
||||
2023/12/20 12:30:05 >>> searching for loop of hd
|
||||
2023/12/20 12:30:05 >>> found loop from 0 to 3793. of size 3794
|
||||
2023/12/20 12:30:05 the pulses
|
||||
+map[3661:[high low]]
|
||||
2023/12/20 12:30:05 >>> searching for loop of bx
|
||||
2023/12/20 12:30:05 >>> found loop from 0 to 3739. of size 3740
|
||||
2023/12/20 12:30:05 the pulses
|
||||
+map[3499:[high low]]
|
||||
2023/12/20 12:30:05 >>> searching for loop of sr
|
||||
2023/12/20 12:30:05 >>> found loop from 0 to 4027. of size 4028
|
||||
2023/12/20 12:30:05 the pulses
|
||||
+map[624:[high low]]
|
||||
*** but at least these 'high low' are all on same step.
|
||||
now with info on loop start, place of pulse in the loop and length of loops,
|
||||
what is the step so that those [high low] occur on same step num?
|
||||
*** math should be:
|
||||
3922 + LOOP_N * (LOOP_LEN)
|
||||
** wait i now get different output?
|
||||
2023/12/20 12:57:50 >>> searching for loop of bx
|
||||
2023/12/20 12:57:50 >>> found loop from 1 to 3739. of size 3739
|
||||
2023/12/20 12:57:50 the pulses: +map[3738:[high low]]
|
||||
2023/12/20 12:57:50 >>> searching for loop of sr
|
||||
2023/12/20 12:57:50 >>> found loop from 0 to 4026. of size 4027
|
||||
2023/12/20 12:57:50 the pulses: +map[286:[high low]]
|
||||
2023/12/20 12:57:50 >>> searching for loop of ch
|
||||
2023/12/20 12:57:50 >>> found loop from 0 to 3922. of size 3923
|
||||
2023/12/20 12:57:50 the pulses: +map[78:[high low]]
|
||||
2023/12/20 12:57:50 >>> searching for loop of hd
|
||||
2023/12/20 12:57:51 >>> found loop from 0 to 3792. of size 3793
|
||||
2023/12/20 12:57:51 the pulses: +map[3481:[high low]]
|
||||
** why is my filtering unstable?
|
||||
** let's check for single loop?
|
||||
** yikes. but maybe
|
||||
2023/12/20 13:08:52 >>> searching for loop of sr
|
||||
2023/12/20 13:08:52 >>> found loop from 2 to 4028. of size 4027
|
||||
2023/12/20 13:08:52 the pulses: +map[4027:[high low]]
|
||||
|
||||
2023/12/20 13:09:23 >>> searching for loop of ch
|
||||
2023/12/20 13:09:23 >>> found loop from 2 to 3924. of size 3923
|
||||
2023/12/20 13:09:23 the pulses: +map[3923:[high low]]
|
||||
|
||||
2023/12/20 13:09:37 >>> searching for loop of hd
|
||||
2023/12/20 13:09:37 >>> found loop from 2 to 3794. of size 3793
|
||||
2023/12/20 13:09:37 the pulses: +map[3793:[high low]]
|
||||
|
||||
2023/12/20 13:09:49 >>> searching for loop of bx
|
||||
2023/12/20 13:09:49 >>> found loop from 2 to 3740. of size 3739
|
||||
2023/12/20 13:09:49 the pulses: +map[3739:[high low]]
|
||||
|
||||
all loops start from the same place.
|
||||
i could just do 1 press. then the loop starts. and all of them have [high low] on last place.
|
||||
so it's going to be 1 + least common ...
|
||||
** aaand, i just did least common multiple of the cycle lengths.
|
||||
and i didn't even add 1. which is strange. i guess i did have 'off-by-one'
|
||||
crap
|
||||
*** yeah. i can start from step 1. but i need to first update State then check for previous
|
||||
|
||||
all loops are from step 1.
|
||||
|
||||
it's just for some reason code was unstable when i was searching for all
|
||||
*** answer is 224046542165867
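for reference, a self-contained sketch of that calculation (cycle lengths taken from the logs above):
#+begin_src go
package main

import "fmt"

func gcd(a, b int) int {
	for b != 0 {
		a, b = b, a%b
	}
	return a
}

func lcm(a, b int) int { return a / gcd(a, b) * b }

func main() {
	answer := 1
	for _, cycle := range []int{4027, 3923, 3793, 3739} {
		answer = lcm(answer, cycle)
	}
	fmt.Println(answer) // 224046542165867
}
#+end_src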
|
||||
105
day20/propagation_test.go
Normal file
@@ -0,0 +1,105 @@
|
||||
package day20
|
||||
|
||||
import (
|
||||
"log"
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestPropagateButtonPressExample1(t *testing.T) {
|
||||
filename := "example1"
|
||||
modules := ReadModules(filename)
|
||||
t.Log("got modules:\n", modules)
|
||||
|
||||
low, high := PropagateButtonPress(modules, 0)
|
||||
t.Logf("got low %d and high %d\n", low, high)
|
||||
t.Log("modules after single button press:\n", modules)
|
||||
|
||||
success := low == 8 && high == 4
|
||||
if !success {
|
||||
t.Errorf("expected low 8 got %d, high 4 got %d", low, high)
|
||||
}
|
||||
}
|
||||
|
||||
func TestPropagateButtonPressExample2(t *testing.T) {
|
||||
filename := "example2"
|
||||
modules := ReadModules(filename)
|
||||
t.Log("got modules:\n", modules)
|
||||
InitStuffs(modules)
|
||||
|
||||
low, high := PropagateButtonPress(modules, 0)
|
||||
t.Logf("got low %d and high %d\n", low, high)
|
||||
t.Log("modules after single button press:\n", modules)
|
||||
|
||||
success := low == 4 && high == 4
|
||||
if !success {
|
||||
t.Errorf("expected low 4 got %d, high 4 got %d", low, high)
|
||||
}
|
||||
}
|
||||
|
||||
func TestPropagateButtonPressExample2FourSteps(t *testing.T) {
|
||||
filename := "example2"
|
||||
modules := ReadModules(filename)
|
||||
t.Log("got modules:\n", modules)
|
||||
InitStuffs(modules)
|
||||
|
||||
initialModulesState := ModulesState(modules)
|
||||
|
||||
low, high := PropagateButtonPress(modules, 0)
|
||||
t.Logf("got low %d and high %d\n", low, high)
|
||||
t.Log("#1 button press:\n", modules)
|
||||
success := low == 4 && high == 4
|
||||
if !success {
|
||||
t.Errorf("expected low 4 got %d, high 4 got %d", low, high)
|
||||
}
|
||||
|
||||
low, high = PropagateButtonPress(modules, 0)
|
||||
t.Logf("got low %d and high %d\n", low, high)
|
||||
t.Log("#2 button press:\n", modules)
|
||||
success = low == 4 && high == 2
|
||||
if !success {
|
||||
t.Errorf("expected low 4 got %d, high 2 got %d", low, high)
|
||||
}
|
||||
secondState := ModulesState(modules)
|
||||
if initialModulesState == secondState {
|
||||
t.Error("initial state should be different from second")
|
||||
}
|
||||
|
||||
low, high = PropagateButtonPress(modules, 0)
|
||||
t.Logf("got low %d and high %d\n", low, high)
|
||||
t.Log("#3 button press:\n", modules)
|
||||
success = low == 5 && high == 3
|
||||
if !success {
|
||||
t.Errorf("expected low 5 got %d, high 3 got %d", low, high)
|
||||
}
|
||||
thirdState := ModulesState(modules)
|
||||
if initialModulesState == thirdState {
|
||||
t.Error("initial state should be different from third")
|
||||
}
|
||||
|
||||
low, high = PropagateButtonPress(modules, 0)
|
||||
t.Logf("got low %d and high %d\n", low, high)
|
||||
t.Log("#4 button press:\n", modules)
|
||||
success = low == 4 && high == 2
|
||||
if !success {
|
||||
t.Errorf("expected low 4 got %d, high 2 got %d", low, high)
|
||||
}
|
||||
|
||||
lastState := ModulesState(modules)
|
||||
|
||||
log.Print("initial modules state:\n", initialModulesState)
|
||||
log.Print("after 4 steps modules state:\n", lastState)
|
||||
if initialModulesState != lastState {
|
||||
t.Error("expected state to be same after 4 steps for example 2")
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestExample1TheQuestion(t *testing.T) {
|
||||
filename := "example1"
|
||||
modules := ReadModules(filename)
|
||||
InitStuffs(modules)
|
||||
|
||||
low, high := Count10000ButtonPresses(modules)
|
||||
t.Log("got low and high: ", low, high)
|
||||
t.Log("response is: ", low * high)
|
||||
}
|
||||
241
day20/pulsePropagation.go
Normal file
@@ -0,0 +1,241 @@
|
||||
package day20
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
// fmt.Println("hello from dya 20")
|
||||
|
||||
filename := "day20/input"
|
||||
modules := ReadModules(filename)
|
||||
InitStuffs(modules)
|
||||
// log.Print("got modules:\n", modules)
|
||||
|
||||
// var low, high int
|
||||
// low, high = Count10000ButtonPresses(modules)
|
||||
// log.Printf("got low %d and high %d\n", low, high)
|
||||
|
||||
// CheckSubgraphsStuff(filename)
|
||||
fmt.Print( AllMermaidFlowChard(modules) )
|
||||
|
||||
var result int
|
||||
// result = CalcCommonStep()
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func CheckSubgraphsStuff(filename string) {
|
||||
// loopStarts := allModules["broadcast"].Outputs()
|
||||
|
||||
// loop start and loop sink
|
||||
loopItems := map[string]string {
|
||||
"sr": "xn",
|
||||
"ch": "xf",
|
||||
"hd": "qn",
|
||||
"bx": "zl",
|
||||
}
|
||||
|
||||
for start, end := range loopItems {
|
||||
allModules := ReadModules(filename)
|
||||
InitStuffs(allModules)
|
||||
|
||||
log.Printf(">>> searching for loop of %s", start)
|
||||
themap := make(map[string]any)
|
||||
loopModules := TransitiveOutputs(start, allModules, themap)
|
||||
// i think my bug is not to reset state of `allModules`
|
||||
_, _, requestedPulses := FindSubGraphLoopLength(loopModules, allModules, end)
|
||||
FilterMonitoredPulses(requestedPulses)
|
||||
log.Printf("the pulses: +%v", requestedPulses)
|
||||
}
|
||||
|
||||
// yeah. and now all cycles start from 1 (first button press)
|
||||
// and then they emit the [high low] on last step of their cycle
|
||||
// so just LCM of these all
|
||||
}
|
||||
|
||||
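// Count10000ButtonPresses presses the button `count` times (set to 1000 for
// part 1); if the full module state repeats, it scales the per-cycle pulse
// counts up to the requested number of presses instead of simulating them all.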
func Count10000ButtonPresses(modules map[string]Module) (lowSignalsCount, highSignalsCount int) {
|
||||
count := 1000
|
||||
type counts struct {
|
||||
low, high int
|
||||
step int
|
||||
}
|
||||
countsAfterState := make(map[string]counts)
|
||||
// after each button press check if reached already known state - cycle is present.
|
||||
// then calculate amount of signals before the loop - how much was on that previous state.
|
||||
// then diff - how much added after the loop
|
||||
|
||||
// for now let's just print the info on loop
|
||||
|
||||
for i := 0; i < count; i++ {
|
||||
if i % 10000 == 0 {
|
||||
log.Println("done button presses: ", i)
|
||||
}
|
||||
stepLow, stepHigh := PropagateButtonPress(modules, i)
|
||||
lowSignalsCount += stepLow
|
||||
highSignalsCount += stepHigh
|
||||
// log.Printf("after step %d low is %d and high is %d", i, lowSignalsCount, highSignalsCount)
|
||||
state := ModulesState(modules)
|
||||
|
||||
prevCounts, found := countsAfterState[state]
|
||||
if found {
|
||||
loopLen := i - prevCounts.step
|
||||
|
||||
log.Printf(">>> found loop. from step %d to step %d. of len %d",
|
||||
prevCounts.step, i, loopLen)
|
||||
|
||||
multiplication := count / loopLen
|
||||
|
||||
lowCountInCycle := lowSignalsCount - prevCounts.low
|
||||
highCountInCycle := highSignalsCount - prevCounts.high
|
||||
|
||||
lowSignalsCount = lowCountInCycle * multiplication
|
||||
highSignalsCount = highCountInCycle * multiplication
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
countsAfterState[state] = counts{stepLow, stepHigh, i}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
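// PropagateButtonPress pushes one low pulse from the button to "broadcast" and
// drains the resulting signal queue in FIFO order, returning how many low and
// high pulses were sent during this single press.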
func PropagateButtonPress(modules map[string]Module, i int) (lowSignalsCount, highSignalsCount int) {
|
||||
signals := []Signal{{From: "button", To: "broadcast", PulseType: LowPulse}}
|
||||
lowSignalsCount += 1
|
||||
|
||||
for len(signals) > 0 {
|
||||
curSignal := signals[0]
|
||||
signals = signals[1:]
|
||||
|
||||
// log.Printf("%s -%s-> %s", curSignal.From, curSignal.PulseType, curSignal.To)
|
||||
|
||||
receivingModule, found := modules[curSignal.To]
|
||||
if !found {
|
||||
// log.Print(fmt.Sprintf("signal %+v can't find it's recepient\n", curSignal))
|
||||
if curSignal.To == "rx" && curSignal.PulseType == LowPulse {
|
||||
panic(fmt.Sprintf("getting low signal to rx, on step %d", i))
|
||||
}
|
||||
continue
|
||||
}
|
||||
|
||||
newSignals := receivingModule.Receive(curSignal)
|
||||
|
||||
// all newSignals will have same type
|
||||
newSignalsAmount := len(newSignals)
|
||||
if newSignalsAmount > 0 {
|
||||
signals = append(signals, newSignals...)
|
||||
someNewSignal := newSignals[0]
|
||||
if someNewSignal.PulseType == HighPulse {
|
||||
highSignalsCount += newSignalsAmount
|
||||
} else {
|
||||
lowSignalsCount += newSignalsAmount
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func PropagateButtonPressWithMonitor(modules map[string]Module, i int, monitorAllOutputsOf string) []PulseType {
|
||||
result := make([]PulseType, 0)
|
||||
|
||||
signals := []Signal{{From: "button", To: "broadcast", PulseType: LowPulse}}
|
||||
|
||||
for len(signals) > 0 {
|
||||
curSignal := signals[0]
|
||||
signals = signals[1:]
|
||||
|
||||
if curSignal.From == monitorAllOutputsOf {
|
||||
result = append(result, curSignal.PulseType)
|
||||
}
|
||||
|
||||
// log.Printf("%s -%s-> %s", curSignal.From, curSignal.PulseType, curSignal.To)
|
||||
|
||||
receivingModule, found := modules[curSignal.To]
|
||||
if !found {
|
||||
// log.Print(fmt.Sprintf("signal %+v can't find it's recepient\n", curSignal))
|
||||
if curSignal.To == "rx" && curSignal.PulseType == LowPulse {
|
||||
panic(fmt.Sprintf("getting low signal to rx, on step %d", i))
|
||||
}
|
||||
continue
|
||||
}
|
||||
|
||||
newSignals := receivingModule.Receive(curSignal)
|
||||
|
||||
// all newSignals will have same type
|
||||
newSignalsAmount := len(newSignals)
|
||||
if newSignalsAmount > 0 {
|
||||
signals = append(signals, newSignals...)
|
||||
}
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
// process sends single `low pulse` directly to "broadcast"
|
||||
|
||||
func ReadModules(filename string) map[string]Module {
|
||||
result := make(map[string]Module)
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(fmt.Sprint("error reading file: ", filename))
|
||||
}
|
||||
|
||||
text := strings.TrimSpace(string(bytes))
|
||||
|
||||
for _, line := range strings.Split(text, "\n") {
|
||||
switch {
|
||||
case IsLineBroadcast(line):
|
||||
parsed := ParseBroadcast(line)
|
||||
result["broadcast"] = &parsed
|
||||
case IsLineFlipFlop(line):
|
||||
parsed := ParseFlipFlop(line)
|
||||
result[parsed.Name] = &parsed
|
||||
case IsLineConjunction(line):
|
||||
parsed := ParseConjunction(line)
|
||||
result[parsed.Name] = &parsed
|
||||
}
|
||||
|
||||
// log.Println(line)
|
||||
}
|
||||
|
||||
buttonModule := Button{}
|
||||
result["button"] = &buttonModule
|
||||
outputModule := Output{}
|
||||
result["output"] = &outputModule
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func InitStuffs(allModules map[string]Module) {
|
||||
for _, module := range allModules {
|
||||
if conjunction, ok := module.(*Conjunction); ok {
|
||||
conjunction.RegisterInputs(allModules)
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func ModulesState(modules map[string]Module) string {
|
||||
// relying on printing of map values to be ordered by key
|
||||
// https://stackoverflow.com/a/54524991/2788805
|
||||
states := make(map[string]string)
|
||||
for name, module := range modules {
|
||||
states[name] = module.StateSnapshot()
|
||||
}
|
||||
|
||||
return fmt.Sprint(states)
|
||||
}
|
||||
|
||||
func AllMermaidFlowChard(allModules map[string]Module) (result string) {
|
||||
result = "flowchart TD\n"
|
||||
for _, module := range allModules {
|
||||
result += module.MermaidFlow()
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
11
day21/example
Normal file
@@ -0,0 +1,11 @@
|
||||
...........
|
||||
.....###.#.
|
||||
.###.##..#.
|
||||
..#.#...#..
|
||||
....#.#....
|
||||
.##..S####.
|
||||
.##..#...#.
|
||||
.......##..
|
||||
.##.#.####.
|
||||
.##..##.##.
|
||||
...........
|
||||
11
day21/example1
Normal file
@@ -0,0 +1,11 @@
|
||||
...........
|
||||
...........
|
||||
...........
|
||||
...........
|
||||
...........
|
||||
.....S.....
|
||||
...........
|
||||
...........
|
||||
...........
|
||||
...........
|
||||
...........
|
||||
105
day21/notes.org
Normal file
@@ -0,0 +1,105 @@
|
||||
#+title: Notes
|
||||
* part 1
|
||||
so we aren't looking for minimal distance.
|
||||
but all plots which are the end of any path of length 'steps left'
|
||||
|
||||
so, i have to follow all possible paths to the end?
|
||||
or. length of 6 and all even - because i could be doing <- ->
|
||||
but i could be doing loop around that would increase path len by odd number
|
||||
|
||||
let's just make direct recursive thing.
|
||||
create set of all reachable by n,
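rough sketch of that (plain coordinates, before any infinite-field handling; neighbors() is assumed to return the four adjacent garden plots, skipping '#'):
#+begin_src go
frontier := map[Coord]bool{start: true}
for step := 0; step < n; step++ {
	next := map[Coord]bool{}
	for c := range frontier {
		for _, nb := range neighbors(c) {
			next[nb] = true
		}
	}
	frontier = next
}
// part 1 answer: len(frontier)
#+end_src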
|
||||
* oh, the part 2.
|
||||
i suppose this 'infinite' garden could be managed with my 'neighbors' work with 'out of field'
|
||||
fairly easy
|
||||
but what about sizes of the maps? are we releasing maps of previous iterations?
|
||||
|
||||
maybe if i directly pass references to prev and current,
|
||||
and manually set 'prev' to target new it will be collected?
|
||||
|
||||
and then elements after these steps 26501365 would fit into memory?
|
||||
** i guess maybe it would help if i had 'fully saturated' field
|
||||
as my minimal 'skipping' thing
|
||||
** so. store FieldCoord(fieldRow, fieldCol) for fields which were fully saturated at current step.
|
||||
|
||||
filter out neighbors, no need to enter fully saturated fields
|
||||
|
||||
when counting
|
||||
on odd - around the S, on even - with S
|
||||
|
||||
but the neighboring fields would potentially (likely?) be in different phases
|
||||
|
||||
but i guess they are necessarily in different phases?
|
||||
or. if width odd - necessarily
|
||||
if width even - then what?
|
||||
|
||||
then S is not in the center
|
||||
|
||||
my input is 131 chars of width.
|
||||
so neighboring are necessarily of different phase.
|
||||
could compute phase of (0,0)
|
||||
and adjust from that
|
||||
** TODO remake 'ReachableBySteps' into 'CountReachableBySteps' returning int
|
||||
** TODO make it take 'isInitialCountOdd' - to know phase of {0,0} field
|
||||
current phase can be determined by initial phase and current N
|
||||
|
||||
if initial count is odd, and now it's odd number, we made even iterations, so (0,0) is in even state
|
||||
if initial count is even, and now it's even number, we made even iterations, so (0,0) is in even state
|
||||
|
||||
** DONE make neighbors take set of saturated fields
|
||||
and not produce points on those fields
|
||||
** DONE for field calculate what would be amount of points in each phase
|
||||
...........
|
||||
.....###.#.
|
||||
.###.##..#.
|
||||
..#.#...#..
|
||||
....#.#....
|
||||
.##..S####.
|
||||
.##..#...#.
|
||||
.......##..
|
||||
.##.#.####.
|
||||
.##..##.##.
|
||||
...........
|
||||
*** getting 39 and 42
|
||||
let's check
|
||||
42 is even?
|
||||
*** hmmm
|
||||
EOEOEOEOEOE
|
||||
OEOEO###O#O
|
||||
E###E##OE#E
|
||||
OE#E#EOE#EO
|
||||
EOEO#O#OEOE
|
||||
O##EOE####O
|
||||
E##OE#EOE#E
|
||||
OEOEOEO##EO
|
||||
E##O#O####E
|
||||
O##EO##E##O
|
||||
EOEOEOEOEOE
|
||||
*** yes, sounds good
|
||||
|
||||
|
||||
** CANCELLED after getting all new points. get coords of all fields we're working on.
|
||||
( there already should be no points in saturated fields )
|
||||
for each such field, check if it is saturated.
|
||||
|
||||
- can be done by comparing the phase with amount of points on saturated
|
||||
|
||||
if field saturated - add the coord into set
|
||||
and remove all the points
|
||||
** CANCELLED on the last step, when n is 0
|
||||
return len(startingAt) + (all saturated fields) * (amount of elems in their phase)
|
||||
** calculating points in even 7356 and odd 7321 phases
|
||||
* so need to scrap things and do a more analytical approach.
|
||||
no blocks on horizontal & vertical from (S)
|
||||
meaning diamond expands to left & right well
|
||||
* 26501365 = 202300 * 131 + 65 where 131 is the dimension of the grid
|
||||
* if there is a formula A*i^2 + B*i + C = D
|
||||
where i is full iteration
|
||||
* for initial steps :
|
||||
2023/12/21 13:25:23 after steps 65. full iter 0. got count 3701
|
||||
2023/12/21 13:25:24 after steps 196. full iter 1. got count 33108
|
||||
2023/12/21 13:25:27 after steps 327. full iter 2. got count 91853
|
||||
2023/12/21 13:25:42 after steps 458. full iter 3. got count 179936
|
||||
|
||||
* https://www.dcode.fr/newton-interpolating-polynomial
|
||||
14669x^2 + 14738*x+3701
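same fit done by hand with finite differences over the three sampled counts, then evaluated at x = 202300:
#+begin_src go
package main

import "fmt"

func main() {
	y0, y1, y2 := 3701, 33108, 91853 // counts after 65, 196, 327 steps
	a := (y2 - 2*y1 + y0) / 2        // 14669
	b := y1 - y0 - a                 // 14738
	c := y0                          // 3701
	x := 202300                      // (26501365 - 65) / 131
	fmt.Println(a*x*x + b*x + c)
}
#+end_src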
|
||||
211
day21/stepCounter.go
Normal file
@@ -0,0 +1,211 @@
|
||||
package day21
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Print("hello day21")
|
||||
filename := "day21/input"
|
||||
field := ReadField(filename)
|
||||
log.Print(field)
|
||||
|
||||
// for i := 6; i <= 10; i++ {
|
||||
// reachableBySteps := field.ReachableBySteps(i, map[Coord]any{
|
||||
// Coord{Row: field.RowStart, Col: field.ColStart}: struct{}{},
|
||||
// })
|
||||
|
||||
// log.Print("reachable after steps : ", i, len(reachableBySteps))
|
||||
// field.PrintCoord(reachableBySteps, 1)
|
||||
// }
|
||||
|
||||
// initialSolutions := make(map[int]int)
|
||||
|
||||
// for fullIter := 0; fullIter < 4; fullIter++ {
|
||||
// steps := 65 + fullIter * 131
|
||||
// reachableBySteps := field.ReachableBySteps(steps, map[FieldPoint]any{
|
||||
// FieldPoint{
|
||||
// InField: Coord{Row: field.RowStart, Col: field.ColStart},
|
||||
// }: struct{}{},
|
||||
// })
|
||||
// log.Printf("after steps %d. full iter %d. got count %d", steps, fullIter, len(reachableBySteps))
|
||||
// initialSolutions[fullIter] = len(reachableBySteps)
|
||||
// }
|
||||
|
||||
log.Println("will try to use the values to get coeff of Ax^2 + Bx + C = 0")
|
||||
log.Println("then solve for x == 202300")
|
||||
// f(x) = 14714x^2 + 14603x + 3791
|
||||
// no.
|
||||
// 14669x^2 + 14738*x+3701
|
||||
|
||||
x := 202300
|
||||
result := 14669*x*x + 14738*x+3701
|
||||
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
// let's do dijkstra?
|
||||
// i would need lots of space for edges?
|
||||
// let's use a map with minimal distances?
|
||||
// OR. just breadth first traversal
|
||||
|
||||
type Field struct {
|
||||
RowStart, ColStart int
|
||||
symbols [][]rune
|
||||
}
|
||||
|
||||
type Coord struct {
|
||||
Row, Col int
|
||||
}
|
||||
|
||||
type FieldPoint struct {
|
||||
InField Coord
|
||||
MetaField Coord
|
||||
}
|
||||
|
||||
func (f Field) ReachableBySteps(n int, startingAt map[FieldPoint]any) map[FieldPoint]any {
|
||||
if n%100 == 0 {
|
||||
log.Println("going step: ", n)
|
||||
}
|
||||
if n == 0 {
|
||||
return startingAt
|
||||
}
|
||||
// else collect directly available
|
||||
|
||||
oneStepExpanded := make(map[FieldPoint]any)
|
||||
for cur := range startingAt {
|
||||
for _, neighbor := range f.Neighbors(cur) {
|
||||
oneStepExpanded[neighbor] = struct{}{}
|
||||
}
|
||||
}
|
||||
|
||||
// if n < 4 {
|
||||
// log.Print("reachable after steps : ", n, len(oneStepExpanded))
|
||||
// f.PrintCoord(oneStepExpanded, 5)
|
||||
// }
|
||||
|
||||
return f.ReachableBySteps(n-1, oneStepExpanded)
|
||||
}
|
||||
|
||||
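// Neighbors returns the up/down/left/right plots of c, wrapping coordinates
// across the tile border and bumping MetaField so copies of the repeating
// field are counted separately; rock cells ('#') are filtered out.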
func (f Field) Neighbors(c FieldPoint) (result []FieldPoint) {
|
||||
closeCoords := []FieldPoint{
|
||||
{InField: Coord{Row: c.InField.Row + 1, Col: c.InField.Col}, MetaField: c.MetaField},
|
||||
{InField: Coord{Row: c.InField.Row - 1, Col: c.InField.Col}, MetaField: c.MetaField},
|
||||
{InField: Coord{Row: c.InField.Row, Col: c.InField.Col + 1}, MetaField: c.MetaField},
|
||||
{InField: Coord{Row: c.InField.Row, Col: c.InField.Col - 1}, MetaField: c.MetaField},
|
||||
}
|
||||
|
||||
for i, close := range closeCoords {
|
||||
height := len(f.symbols)
|
||||
width := len(f.symbols[0])
|
||||
if close.InField.Row == height {
|
||||
close.InField.Row = 0
|
||||
close.MetaField.Row += 1
|
||||
}
|
||||
if close.InField.Row == -1 {
|
||||
close.InField.Row = height - 1
|
||||
close.MetaField.Row -= 1
|
||||
}
|
||||
if close.InField.Col == width {
|
||||
close.InField.Col = 0
|
||||
close.MetaField.Col += 1
|
||||
}
|
||||
if close.InField.Col == -1 {
|
||||
// log.Printf("moving COL to lefter field from %d to %d", close.Col, width-1)
|
||||
close.InField.Col = width - 1
|
||||
close.MetaField.Col -= 1
|
||||
}
|
||||
closeCoords[i] = close
|
||||
// but this is not it. i need to store the XX and YY
|
||||
// so that points in other 'fields' would count separately. yuk
|
||||
}
|
||||
|
||||
for _, close := range closeCoords {
|
||||
if f.ValidCoord(close.InField.Row, close.InField.Col) {
|
||||
symb := f.symbols[close.InField.Row][close.InField.Col]
|
||||
if symb == '.' || symb == 'S' {
|
||||
result = append(result, close)
|
||||
|
||||
}
|
||||
|
||||
}
|
||||
}
|
||||
|
||||
// log.Print("getting neighbors for ", c, resut)
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (f Field) ValidCoord(row, col int) bool {
|
||||
// log.Print("check valid ", row, col, row >= 0 && row < len(f.symbols) && col >= 0 && col < len(f.symbols[0]))
|
||||
|
||||
valid := row >= 0 && row < len(f.symbols) && col >= 0 && col < len(f.symbols[0])
|
||||
if !valid {
|
||||
panic(fmt.Sprint("getting invalid coord: ", row, col))
|
||||
}
|
||||
return valid
|
||||
}
|
||||
|
||||
func (f Field) String() (result string) {
|
||||
result += "\n"
|
||||
for _, line := range f.symbols {
|
||||
result += string(line)
|
||||
result += "\n"
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func ReadField(filename string) (result Field) {
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
text := strings.TrimSpace(string(bytes))
|
||||
|
||||
lines := strings.Split(text, "\n")
|
||||
rows := make([][]rune, len(lines))
|
||||
for rowNum, line := range lines {
|
||||
rows[rowNum] = []rune(line)
|
||||
for colNum, symb := range line {
|
||||
if symb == 'S' {
|
||||
result.RowStart = rowNum
|
||||
result.ColStart = colNum
|
||||
}
|
||||
}
|
||||
}
|
||||
result.symbols = rows
|
||||
return
|
||||
}
|
||||
|
||||
func (f Field) PrintCoord(coords map[FieldPoint]any, expandByField int) {
|
||||
|
||||
for fieldRow := -expandByField; fieldRow <= expandByField; fieldRow++ {
|
||||
lines := make([]string, len(f.symbols))
|
||||
for fieldCol := -expandByField; fieldCol <= expandByField; fieldCol++ {
|
||||
|
||||
for rowNum, row := range f.symbols {
|
||||
for colNum, col := range row {
|
||||
_, marked := coords[FieldPoint{InField: Coord{Row: rowNum, Col: colNum},
|
||||
MetaField: Coord{Row: fieldRow, Col: fieldCol}}]
|
||||
if marked {
|
||||
lines[rowNum] += "O"
|
||||
} else {
|
||||
lines[rowNum] += string(col)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
for _, line := range lines {
|
||||
fmt.Println(line)
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
104
day22/block.go
Normal file
@@ -0,0 +1,104 @@
|
||||
package day22
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"regexp"
|
||||
"strconv"
|
||||
"strings"
|
||||
"github.com/deckarep/golang-set/v2"
|
||||
)
|
||||
|
||||
type XY struct {
|
||||
X, Y uint
|
||||
}
|
||||
|
||||
type Block struct {
|
||||
NameNum int
|
||||
XMin, XMax uint
|
||||
YMin, YMax uint
|
||||
Z uint
|
||||
IsSettled bool
|
||||
ZHeight uint
|
||||
Supports mapset.Set[*Block]
|
||||
SupportedBy mapset.Set[*Block]
|
||||
}
|
||||
|
||||
func (b *Block) String() string {
|
||||
return fmt.Sprintf("[Block %d - x:%d-%d, y:%d-%d, z:%d, h:%d, isSettled %t]",
|
||||
b.NameNum, b.XMin, b.XMax, b.YMin, b.YMax, b.Z, b.ZHeight, b.IsSettled)
|
||||
}
|
||||
|
||||
func AtoIOrPanic(a string) int {
|
||||
n, err := strconv.Atoi(a)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
return n
|
||||
}
|
||||
|
||||
func ReadBlock(line string, num int) (b Block) {
|
||||
b.NameNum = num
|
||||
re := regexp.MustCompile(`(\d+),(\d+),(\d+)~(\d+),(\d+),(\d+)`)
|
||||
matches := re.FindStringSubmatch(line)
|
||||
|
||||
x1, x2 := AtoIOrPanic(matches[1]), AtoIOrPanic(matches[4])
|
||||
y1, y2 := AtoIOrPanic(matches[2]), AtoIOrPanic(matches[5])
|
||||
z1, z2 := AtoIOrPanic(matches[3]), AtoIOrPanic(matches[6])
|
||||
|
||||
b.XMax = uint(max(x1, x2))
|
||||
b.XMin = uint(min(x1, x2))
|
||||
b.YMax = uint(max(y1, y2))
|
||||
b.YMin = uint(min(y1, y2))
|
||||
b.Z = uint(min(z1, z2))
|
||||
b.ZHeight = uint(max(z1, z2)) - b.Z
|
||||
|
||||
b.Supports = mapset.NewSet[*Block]()
|
||||
b.SupportedBy = mapset.NewSet[*Block]()
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (b *Block) getXY() (coords []XY) {
|
||||
for x := b.XMin; x <= b.XMax; x++ {
|
||||
for y := b.YMin; y <= b.YMax; y++ {
|
||||
coords = append(coords, XY{X: x, Y: y})
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func ReadBlockFile(filename string) (blocks []*Block) {
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
text := strings.TrimSpace(string(bytes))
|
||||
for i, line := range strings.Split(text, "\n") {
|
||||
block := ReadBlock(line, i)
|
||||
blocks = append(blocks, &block)
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func BlocksByZ(blocks []*Block) [][]*Block {
|
||||
maxZ := uint(0)
|
||||
for _, block := range blocks {
|
||||
if block.Z > maxZ {
|
||||
maxZ = block.Z
|
||||
}
|
||||
}
|
||||
log.Print("found max z: ", maxZ)
|
||||
|
||||
result := make([][]*Block, maxZ+1)
|
||||
|
||||
for _, block := range blocks {
|
||||
result[block.Z] = append(result[block.Z], block)
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
37
day22/block_test.go
Normal file
@@ -0,0 +1,37 @@
|
||||
package day22
|
||||
|
||||
import (
|
||||
"strings"
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestReadBlock(t *testing.T) {
|
||||
lines := `1,0,1~1,2,1
|
||||
0,0,2~2,0,2
|
||||
0,2,3~2,2,3
|
||||
0,0,4~0,2,4
|
||||
2,0,5~2,2,5
|
||||
0,1,6~2,1,6
|
||||
1,1,8~1,1,9`
|
||||
|
||||
for _, line := range strings.Split(lines, "\n") {
|
||||
b := ReadBlock(line, 0)
|
||||
t.Logf("read %s into block %+v", line, b)
|
||||
t.Logf("XY coords for %+v are : %+v", b, b.getXY())
|
||||
}
|
||||
}
|
||||
|
||||
func TestReadFile(t *testing.T) {
|
||||
filename := "example"
|
||||
// filename := "input"
|
||||
blocks := ReadBlockFile(filename)
|
||||
|
||||
byZ := BlocksByZ(blocks)
|
||||
for z, zBlocks := range byZ {
|
||||
zBlocksLine := ""
|
||||
for _, block := range zBlocks {
|
||||
zBlocksLine += block.String()
|
||||
}
|
||||
t.Logf("for level %d blocks %+v", z, zBlocksLine)
|
||||
}
|
||||
}
|
||||
7
day22/example
Normal file
@@ -0,0 +1,7 @@
|
||||
1,0,1~1,2,1
|
||||
0,0,2~2,0,2
|
||||
0,2,3~2,2,3
|
||||
0,0,4~0,2,4
|
||||
2,0,5~2,2,5
|
||||
0,1,6~2,1,6
|
||||
1,1,8~1,1,9
|
||||
76
day22/notes.org
Normal file
@@ -0,0 +1,76 @@
|
||||
#+title: Notes
|
||||
* ok. let's try this.
|
||||
i'd want to have block type
|
||||
with a function to get its XY coords
|
||||
|
||||
i'd want to settle blocks first.
|
||||
but if i store enough data, for example a block.supports slice, i'll be able to answer the first task.
|
||||
|
||||
(settledOnZ) i would want [][]*Block per level from 0 to up. with references to blocks that settled on that level
|
||||
|
||||
(maxSettledXY) and for going from 0 up i'll want XY of the top block settled with its level. i guess i could store the settled level in the block as well
|
||||
|
||||
then for settling blocks, i will need (a sorted map if data is sparse?) to go from 0 up,
|
||||
order of processing for blocks on same z level is not important.
|
||||
for each block get its XY, check maxSettledXY if there's a block check its Z,
|
||||
for all block XY coords, find maximal settled Z, and refs to all blocks that are directly under with that same Z.
|
||||
|
||||
for the block set settledZ to Z+1, and for all those blocks add the block to 'supports'
|
||||
add block to settledOnZ[Z+1]
|
||||
|
||||
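a rough sketch of that settle step (names borrowed from block.go; the real implementation ended up in space.go):

#+begin_src go
// land one block on top of the highest already-settled block under any of
// its XY columns, and record the support links in both directions
func settleOne(b *Block, maxSettledOnXY map[XY]*Block) {
	top := uint(0)
	under := mapset.NewSet[*Block]()
	for _, xy := range b.getXY() {
		if u, ok := maxSettledOnXY[xy]; ok {
			if u.Z+u.ZHeight > top {
				top = u.Z + u.ZHeight
				under = mapset.NewSet(u)
			} else if u.Z+u.ZHeight == top {
				under.Add(u)
			}
		}
		maxSettledOnXY[xy] = b
	}
	for u := range under.Iter() {
		u.Supports.Add(b)
		b.SupportedBy.Add(u)
	}
	b.Z = top + 1
	b.IsSettled = true
}
#+end_src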
for the second part, i can scan all the blocks, don't even need the settledOnZ, just check if its 'supports' is empty
|
||||
|
||||
** DONE block type
|
||||
store z, and have 'settledZ', maybe with default -1?
|
||||
** DONE coords type, func to get XY coords of the block
|
||||
** DONE now i guess what? do i want a sorted map? or just map from height to blocks on that height?
|
||||
let's read file, and calc max height present?
|
||||
i suppose the function to read the file could also be initially entered via a test, right?
|
||||
** DONE now go through the z levels, block by block, doing settling.
|
||||
i suppose i could organize settling methods around Space?
|
||||
it will store (settledOnZ) and (maxSettledOnXY)
|
||||
** DONE [#A] when i settle single block. the maxSettledOnXY - should use (z + height)
|
||||
** i can already imagine the second part? what is the most volume that can be disintegrated? or what? most volume is just all
|
||||
* part 1, wrong answer.
|
||||
i guess try to go, settling the input? block after block and try to check the calculations?
|
||||
|
||||
what i want to check:
|
||||
how maxSettledOnXY works, how linking works. maybe i'll find a problem in few steps =C
|
||||
** can't see anything just glancing around.
|
||||
maybe then trying to pick a block and track what's under it?
|
||||
* ok. let's try to brute force?
|
||||
for each block, remove it?
|
||||
create new space and try to settle it
|
||||
** this is shit. why do blocks move up?
|
||||
2023/12/22 12:12:24 >>> starting for block [Block 1 - x:0-2, y:0-0, z:1, h:0, isSettled true] (supports [[Block 3 - x:0-0, y:0-2, z:3, h:0, isSettled true] [Block 4 - x:2-2, y:0-2, z:3, h:0, isSettled true] [Block 3 - x:0-0, y:0-2, z:2, h:0, isSettled true] [Block 4 - x:2-2, y:0-2, z:2, h:0, isSettled true]])
|
||||
2023/12/22 12:12:24 block [Block 2 - x:0-2, y:2-2, z:2, h:0, isSettled true] moved from 1 to 2
|
||||
2023/12/22 12:12:24 block [Block 3 - x:0-0, y:0-2, z:3, h:0, isSettled true] moved from 2 to 3
|
||||
2023/12/22 12:12:24 block [Block 4 - x:2-2, y:0-2, z:3, h:0, isSettled true] moved from 2 to 3
|
||||
2023/12/22 12:12:24 block [Block 5 - x:0-2, y:1-1, z:4, h:0, isSettled true] moved from 3 to 4
|
||||
2023/12/22 12:12:24 block [Block 6 - x:1-1, y:1-1, z:5, h:1, isSettled true] moved from 4 to 5
|
||||
2023/12/22 12:12:24 for block [Block 1 - x:0-2, y:0-0, z:1, h:0, isSettled true] new space has 5 moved
|
||||
* ok. brute force with copying slices worked.
|
||||
now i want to debug.
|
||||
|
||||
for each brick, when there are 0 falling, i want to check what its surroundings are
|
||||
** my initial was : 567
|
||||
** checking example of badly determined:
|
||||
>> for block [Block 291 - x:6-8, y:7-7, z:75, h:0, isSettled false]
|
||||
checking under coord {X:6 Y:7}. found under [Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true]. ( 'overriding' ) with 35 ; maxZ 35
|
||||
directly supporting blocks are [[Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true]]
|
||||
checking under coord {X:7 Y:7}. found under [Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true]. ( 'adding' ) with 35 ; maxZ 35
|
||||
directly supporting blocks are [[Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true] [Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true]]
|
||||
checking under coord {X:8 Y:7}. found under [Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true]. ( 'adding' ) with 35 ; maxZ 35
|
||||
directly supporting blocks are [[Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true] [Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true] [Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true]]
|
||||
>> after settring block [Block 291 - x:6-8, y:7-7, z:36, h:0, isSettled true]. supported by [[Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true] [Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true] [Block 698 - x:6-8, y:7-7, z:35, h:0, isSettled true]]
|
||||
** ouch. duplicates in slices. because there's no easy set thingy
|
||||
not doing this was my bug.
|
||||
|
||||
#+begin_src go
|
||||
slices.SortFunc(block.SupportedBy, func(a *Block, b *Block) int {
|
||||
return cmp.Compare(a.NameNum, b.NameNum)
|
||||
})
|
||||
block.SupportedBy = slices.Compact(block.SupportedBy)
|
||||
#+end_src
|
||||
* maybe rewrite with Set?
|
||||
* should have done that from the start
|
||||
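with the set type from the library that ended up in =block.go=, that dedup just isn't needed (small sketch, assuming github.com/deckarep/golang-set/v2):

#+begin_src go
u := &Block{NameNum: 698}
supportedBy := mapset.NewSet[*Block]()
supportedBy.Add(u) // adding the same *Block again is a no-op
supportedBy.Add(u)
fmt.Println(supportedBy.Cardinality()) // prints 1
#+end_src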
38
day22/printingSpace.go
Normal file
@@ -0,0 +1,38 @@
|
||||
package day22
|
||||
|
||||
import (
|
||||
"math"
|
||||
|
||||
"github.com/tidwall/pinhole"
|
||||
)
|
||||
|
||||
func TestPinhole() {
|
||||
p := pinhole.New()
|
||||
p.DrawCube(-0.3, -0.3, -0.3, 0.3, 0.3, 0.3)
|
||||
p.Rotate(math.Pi/3, math.Pi/2, 0)
|
||||
p.SavePNG("cube.png", 500, 500, nil)
|
||||
}
|
||||
|
||||
func PrintSpace(s Space, filename string) {
|
||||
// pinhole is from -1 to 1. let's use from 0 to 1.
|
||||
// so coord should be divided by max height, and let's hope that they are not too wide
|
||||
|
||||
rotation := []float64{math.Pi/3, math.Pi/6, 0}
|
||||
|
||||
p := pinhole.New()
|
||||
|
||||
p.DrawRect(-1, -1, 1, 1, 0)
|
||||
|
||||
for _, zLevel := range s.SettledOnZ {
|
||||
for _, block := range zLevel {
|
||||
p.DrawCube(float64(block.XMin) / float64(s.MaxZ),
|
||||
float64(block.YMin) / float64(s.MaxZ),
|
||||
float64(block.Z) / float64(s.MaxZ),
|
||||
float64(block.XMax + 1) / float64(s.MaxZ),
|
||||
float64(block.YMax + 1) / float64(s.MaxZ),
|
||||
float64(block.Z + block.ZHeight + 1) / float64(s.MaxZ))
|
||||
}
|
||||
}
|
||||
p.Rotate(rotation[0], rotation[1], rotation[2])
|
||||
p.SavePNG(filename, 1920, 1080, nil)
|
||||
}
|
||||
18
day22/sandSlabs.go
Normal file
@@ -0,0 +1,18 @@
|
||||
package day22
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Print("oi, hello day 22")
|
||||
filename := "day22/input"
|
||||
blocks := ReadBlockFile(filename)
|
||||
byZ := BlocksByZ(blocks)
|
||||
|
||||
space := NewSpace(byZ)
|
||||
space.SettleAll()
|
||||
|
||||
result := space.CountFreeBlocks()
|
||||
return result
|
||||
}
|
||||
206
day22/space.go
Normal file
@@ -0,0 +1,206 @@
|
||||
package day22
|
||||
|
||||
import (
|
||||
"log"
|
||||
"slices"
|
||||
)
|
||||
|
||||
type Space struct {
|
||||
MaxZ uint
|
||||
SettledOnZ [][]*Block
|
||||
MaxSettledOnXY map[XY]*Block
|
||||
UnsettledByZ [][]*Block
|
||||
}
|
||||
|
||||
func NewSpace(blocksByZ [][]*Block) Space {
|
||||
return Space{
|
||||
UnsettledByZ: blocksByZ,
|
||||
MaxZ: uint(len(blocksByZ) - 1),
|
||||
MaxSettledOnXY: make(map[XY]*Block),
|
||||
SettledOnZ: make([][]*Block, len(blocksByZ)),
|
||||
}
|
||||
}
|
||||
|
||||
func (s *Space) AgainCountFreeBlocks() (result int) {
|
||||
for _, row := range s.SettledOnZ {
|
||||
for _, block := range row {
|
||||
thisSupports := block.Supports
|
||||
canDisintegrate := true
|
||||
for blockThisSupports := range thisSupports.Iter() {
|
||||
if blockThisSupports.SupportedBy.Cardinality() == 1 {
|
||||
// we cannot disintegrate this block
|
||||
canDisintegrate = false
|
||||
}
|
||||
}
|
||||
if canDisintegrate {
|
||||
result += 1
|
||||
}
|
||||
}
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (s *Space) InitialCollectGoodToDisintegrate() (result []Block) {
|
||||
allBlocks := make(map[*Block]any)
|
||||
|
||||
for _, row := range s.SettledOnZ {
|
||||
for _, block := range row {
|
||||
allBlocks[block] = struct{}{}
|
||||
if block.SupportedBy.Cardinality() == 1 {
|
||||
onlySupport, _ := block.SupportedBy.Pop()
|
||||
log.Printf("in block %+v. only support is %+v", block, onlySupport)
|
||||
log.Printf("should be NOT OK to remove %+v", onlySupport)
|
||||
delete(allBlocks, onlySupport)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
for block := range allBlocks {
|
||||
result = append(result, *block)
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (s *Space) CountFreeBlocks() (result int) {
|
||||
return len(s.InitialCollectGoodToDisintegrate())
|
||||
}
|
||||
|
||||
func (s *Space) ThirdTimeCollectGoodToDisintegrate() (blocks []Block) {
|
||||
// for each block create a new space without it. try to settle and check if 0 moved
|
||||
log.Println(">>>>> starting hardcode count <<<<<")
|
||||
for rowNum, row := range s.SettledOnZ {
|
||||
for blockNum, block := range row {
|
||||
// log.Printf(">>> starting for block %+v (supports %+v)\n", block, block.Supports)
|
||||
newUnsettled := slices.Clone(s.SettledOnZ)
|
||||
for rowNum, row := range newUnsettled {
|
||||
newUnsettled[rowNum] = slices.Clone(row)
|
||||
}
|
||||
newUnsettled[rowNum] = slices.Delete(newUnsettled[rowNum], blockNum, blockNum+1)
|
||||
// and now copy the blocks
|
||||
for rowNum, row := range newUnsettled {
|
||||
for blockNum, block := range row {
|
||||
newBlock := *block
|
||||
newUnsettled[rowNum][blockNum] = &newBlock
|
||||
}
|
||||
}
|
||||
|
||||
newSpace := NewSpace(newUnsettled)
|
||||
moved := newSpace.SettleAll()
|
||||
if moved > 0 {
|
||||
// log.Printf("for block %+v new space has %d moved\n\n", block, moved)
|
||||
} else {
|
||||
// log.Printf("for block %+v new space has %d moved\n\n", block, moved)
|
||||
blocks = append(blocks, *block)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (s *Space) ThirdTimeCountFreeBlocks() (result int) {
|
||||
return len(s.ThirdTimeCollectGoodToDisintegrate())
|
||||
}
|
||||
|
||||
func (s *Space) CountChainReactions() (result int) {
|
||||
for rowNum, row := range s.SettledOnZ {
|
||||
for blockNum := range row {
|
||||
newUnsettled := slices.Clone(s.SettledOnZ)
|
||||
for rowNum, row := range newUnsettled {
|
||||
newUnsettled[rowNum] = slices.Clone(row)
|
||||
}
|
||||
newUnsettled[rowNum] = slices.Delete(newUnsettled[rowNum], blockNum, blockNum+1)
|
||||
// and now copy the blocks
|
||||
for rowNum, row := range newUnsettled {
|
||||
for blockNum, block := range row {
|
||||
newBlock := *block
|
||||
newUnsettled[rowNum][blockNum] = &newBlock
|
||||
}
|
||||
}
|
||||
|
||||
newSpace := NewSpace(newUnsettled)
|
||||
moved := newSpace.SettleAll()
|
||||
result += moved
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (s *Space) SettleAll() (totalMoved int) {
|
||||
for i := uint(1); i <= s.MaxZ; i++ {
|
||||
movedAfterLayer := s.SettleZ(i)
|
||||
totalMoved += movedAfterLayer
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
// settle all blocks in Z, remove Z from UnsettledByZ
|
||||
func (s *Space) SettleZ(z uint) (totalMoved int) {
|
||||
blocksToSettle := s.UnsettledByZ[int(z)]
|
||||
|
||||
for _, block := range blocksToSettle {
|
||||
hasMoved := s.SettleBlock(block)
|
||||
if hasMoved {
|
||||
totalMoved += 1
|
||||
}
|
||||
}
|
||||
|
||||
s.UnsettledByZ[int(z)] = nil
|
||||
return
|
||||
}
|
||||
|
||||
// for the block:
|
||||
// check all XY in MaxSettledOnXY
|
||||
// if there are any settled blocks on these XY, find max of their Z
|
||||
// for all blocks with that Z - add block to their 'supports'
|
||||
// set Z for block to Z+1, settled to true
|
||||
// add block as highest settled for all the XY
|
||||
// add block to MaxSettledOnXY
|
||||
func (s *Space) SettleBlock(block *Block) (hasMoved bool) {
|
||||
initialZ := block.Z
|
||||
underZMax := uint(0)
|
||||
underZBlocks := make([]*Block, 0)
|
||||
// fmt.Printf("\n>> for block %s\n", block)
|
||||
for _, xy := range block.getXY() {
|
||||
underBlock, found := s.MaxSettledOnXY[xy]
|
||||
// if block.NameNum
|
||||
if found {
|
||||
underBlockMaxZ := underBlock.Z + underBlock.ZHeight
|
||||
// action := " 'skipping' "
|
||||
if underBlockMaxZ > underZMax {
|
||||
underZBlocks = []*Block{underBlock}
|
||||
underZMax = underBlockMaxZ
|
||||
// action = " 'overriding' "
|
||||
} else if underBlockMaxZ == underZMax {
|
||||
underZBlocks = append(underZBlocks, underBlock)
|
||||
// action = " 'adding' "
|
||||
}
|
||||
// fmt.Printf("checking under coord %+v. found under %+v. (%s) with %d ; maxZ %d\n directly supporting blocks are %+v\n",
|
||||
// xy, underBlock, action, underBlockMaxZ, underZMax, underZBlocks)
|
||||
} else {
|
||||
// fmt.Printf("checking under coord %+v. nothing under\n", xy)
|
||||
}
|
||||
s.MaxSettledOnXY[xy] = block
|
||||
}
|
||||
|
||||
for _, settledUnderblock := range underZBlocks {
|
||||
settledUnderblock.Supports.Add(block)
|
||||
block.SupportedBy.Add(settledUnderblock)
|
||||
|
||||
}
|
||||
|
||||
block.Z = underZMax + 1
|
||||
block.IsSettled = true
|
||||
|
||||
s.SettledOnZ[block.Z] = append(s.SettledOnZ[block.Z], block)
|
||||
// fmt.Printf(">> after settring block %s. supported by %+v\n\n", block, block.SupportedBy)
|
||||
|
||||
// time.Sleep(500 * time.Millisecond)
|
||||
hasMoved = initialZ != block.Z
|
||||
// if hasMoved {
|
||||
// log.Printf("block %+v moved from %d to %d", block, initialZ, block.Z)
|
||||
// }
|
||||
return
|
||||
}
|
||||
135
day22/space_test.go
Normal file
@@ -0,0 +1,135 @@
|
||||
package day22
|
||||
|
||||
import (
|
||||
"slices"
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestSpaceSettleSingle(t *testing.T) {
|
||||
filename := "example"
|
||||
blocks := ReadBlockFile(filename)
|
||||
byZ := BlocksByZ(blocks)
|
||||
|
||||
space := NewSpace(byZ)
|
||||
t.Logf("read space %+v", space)
|
||||
|
||||
block := blocks[2]
|
||||
t.Logf("block before setting %+v", block)
|
||||
space.SettleBlock(block)
|
||||
t.Logf("space after settings %+v:\n%+v", block, space)
|
||||
}
|
||||
|
||||
func TestSpaceSettleSecondNearby(t *testing.T) {
|
||||
filename := "example"
|
||||
blocks := ReadBlockFile(filename)
|
||||
byZ := BlocksByZ(blocks)
|
||||
|
||||
space := NewSpace(byZ)
|
||||
t.Logf("read space %+v", space)
|
||||
|
||||
block1 := blocks[0]
|
||||
block2 := blocks[3]
|
||||
t.Logf("block 1 before setting %+v", block1)
|
||||
space.SettleBlock(block1)
|
||||
t.Logf("space after settling block 1 %+v", space)
|
||||
t.Logf("block 2 before setting %+v", block2)
|
||||
space.SettleBlock(block2)
|
||||
t.Logf("space after settling block 2 %+v", space)
|
||||
t.Logf("space after settling %+v", space)
|
||||
}
|
||||
|
||||
func TestSpaceSettleThirdOnTopFirst(t *testing.T) {
|
||||
filename := "example"
|
||||
blocks := ReadBlockFile(filename)
|
||||
byZ := BlocksByZ(blocks)
|
||||
|
||||
space := NewSpace(byZ)
|
||||
t.Logf("read space %+v", space)
|
||||
|
||||
block1 := blocks[0]
|
||||
block2 := blocks[3]
|
||||
block3 := blocks[2] // should overlap X & Y coords of block 1
|
||||
t.Logf("block 1 before setting %+v", block1)
|
||||
space.SettleBlock(block1)
|
||||
t.Logf("space after settling block 1 %+v", space)
|
||||
t.Logf("block 2 before setting %+v", block2)
|
||||
space.SettleBlock(block2)
|
||||
t.Logf("space after settling block 2 %+v", space)
|
||||
t.Logf("block 3 before setting %+v", block3)
|
||||
space.SettleBlock(block3)
|
||||
t.Logf("space after settling block 3 %+v", space)
|
||||
t.Logf("space after settling %+v", space)
|
||||
|
||||
t.Logf("blocks 1 & 2 should support it: %+v , %+v", block1.Supports, block2.Supports)
|
||||
// because block 3 is 0-2, 2-2
|
||||
// and that overlaps 1-1, 0-2 AND 0-0, 0-2
|
||||
t.Logf("block 3 should not support any other blocks: %+v", block3.Supports)
|
||||
}
|
||||
|
||||
func TestSpaceExampleSettleAll(t *testing.T) {
|
||||
filename := "example"
|
||||
blocks := ReadBlockFile(filename)
|
||||
byZ := BlocksByZ(blocks)
|
||||
|
||||
space := NewSpace(byZ)
|
||||
space.SettleAll()
|
||||
|
||||
t.Logf("settled space %+v", space)
|
||||
|
||||
// maybe i can check via console.
|
||||
i := 2
|
||||
t.Logf("level %d is : %+v", i, space.SettledOnZ[i])
|
||||
// it looks ok for the example.
|
||||
// let's hope?
|
||||
|
||||
t.Logf("for example, free blocks amount is %d", space.CountFreeBlocks())
|
||||
// oh, i need 'supported'?
|
||||
// how do i need to count the task question
|
||||
// i guess we can start with set of all blocks, then?
|
||||
// run over all, if some block is only supported by some underBlock - remove that underblock
|
||||
}
|
||||
|
||||
func TestPinholeStart(t *testing.T) {
|
||||
TestPinhole()
|
||||
}
|
||||
|
||||
func TestExampleSpacePrint(t *testing.T) {
|
||||
filename := "example"
|
||||
blocks := ReadBlockFile(filename)
|
||||
byZ := BlocksByZ(blocks)
|
||||
|
||||
space := NewSpace(byZ)
|
||||
|
||||
// PrintSpace(space, "before-settping.png")
|
||||
|
||||
space.SettleAll()
|
||||
|
||||
PrintSpace(space, "after-settping.png")
|
||||
|
||||
}
|
||||
|
||||
func TestCompareInitialAndBruteforce(t *testing.T) {
|
||||
filename := "input"
|
||||
blocks := ReadBlockFile(filename)
|
||||
byZ := BlocksByZ(blocks)
|
||||
|
||||
space := NewSpace(byZ)
|
||||
|
||||
space.SettleAll()
|
||||
|
||||
initialBlocks := space.InitialCollectGoodToDisintegrate()
|
||||
correct := space.ThirdTimeCollectGoodToDisintegrate()
|
||||
|
||||
t.Log("len of initial solution : ", len(initialBlocks))
|
||||
t.Log("len of correct solution : ", len(correct))
|
||||
|
||||
for _, disintegratableInInitial := range initialBlocks {
|
||||
indexInCorrect := slices.IndexFunc(correct, func(e Block) bool {
|
||||
return e.NameNum == disintegratableInInitial.NameNum
|
||||
})
|
||||
if indexInCorrect == -1 {
|
||||
t.Logf("> found %+v. falsely marked as disintegratable\n\n", disintegratableInInitial)
|
||||
}
|
||||
|
||||
}
|
||||
}
|
||||
26
day23/aLongWalk.go
Normal file
@@ -0,0 +1,26 @@
|
||||
package day23
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
)
|
||||
|
||||
// length of longest scenic route
|
||||
func Run() int {
|
||||
fmt.Println("day 23")
|
||||
max := 0
|
||||
filename := "day23/input"
|
||||
field := ReadField(filename)
|
||||
fmt.Println(field.SparseString())
|
||||
// finalPaths := RunDFSTingy(field)
|
||||
// // log.Println(finalPaths)
|
||||
|
||||
// for _, path := range finalPaths {
|
||||
// if path.Visited.Cardinality() > max {
|
||||
// log.Println("one path len is ", path.Visited.Cardinality())
|
||||
// max = path.Visited.Cardinality()
|
||||
// }
|
||||
// }
|
||||
|
||||
return max
|
||||
}
|
||||
|
||||
23
day23/example
Normal file
@@ -0,0 +1,23 @@
|
||||
#.#####################
|
||||
#.......#########...###
|
||||
#######.#########.#.###
|
||||
###.....#.>.>.###.#.###
|
||||
###v#####.#v#.###.#.###
|
||||
###.>...#.#.#.....#...#
|
||||
###v###.#.#.#########.#
|
||||
###...#.#.#.......#...#
|
||||
#####.#.#.#######.#.###
|
||||
#.....#.#.#.......#...#
|
||||
#.#####.#.#.#########v#
|
||||
#.#...#...#...###...>.#
|
||||
#.#.#v#######v###.###v#
|
||||
#...#.>.#...>.>.#.###.#
|
||||
#####v#.#.###v#.#.###.#
|
||||
#.....#...#...#.#.#...#
|
||||
#.#########.###.#.#.###
|
||||
#...###...#...#...#.###
|
||||
###.###.#.###v#####v###
|
||||
#...#...#.#.>.>.#.>.###
|
||||
#.###.###.#.###.#.#v###
|
||||
#.....###...###...#...#
|
||||
#####################.#
|
||||
10
day23/example2
Normal file
@@ -0,0 +1,10 @@
|
||||
#.#####################
|
||||
#.#####################
|
||||
#.##............#######
|
||||
#.##.##########.#######
|
||||
#....##########.#######
|
||||
####..#########.#######
|
||||
#####...........#######
|
||||
###############.#######
|
||||
###############.#######
|
||||
###############.#######
|
||||
175
day23/field.go
Normal file
@@ -0,0 +1,175 @@
|
||||
package day23
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"strings"
|
||||
)
|
||||
|
||||
type Coord struct {
|
||||
Row, Col int
|
||||
}
|
||||
|
||||
type CellType rune
|
||||
|
||||
const (
|
||||
Path CellType = '.'
|
||||
Tree CellType = '#'
|
||||
SlideDown CellType = 'v'
|
||||
SlideUp CellType = '^'
|
||||
SlideLeft CellType = '<'
|
||||
SlideRight CellType = '>'
|
||||
)
|
||||
|
||||
type Field struct {
|
||||
MaxRow, MaxCol int
|
||||
Cells map[Coord]CellType
|
||||
StartCol, EndCol int
|
||||
}
|
||||
|
||||
func (f *Field) EndCoord() Coord {
|
||||
return Coord{Row: f.MaxRow, Col: f.EndCol}
|
||||
}
|
||||
|
||||
func (f *Field) NeighborsPart2(c Coord) (neighbors []Coord) {
|
||||
symb, exists := f.Cells[c]
|
||||
if !exists {
|
||||
panic(fmt.Sprintf("coord %+v not found in field", c))
|
||||
}
|
||||
|
||||
var coords []Coord
|
||||
switch symb {
|
||||
case Tree:
|
||||
panic(fmt.Sprintf("attempting to get neighbors of a tree at %+v", c))
|
||||
default:
|
||||
coords = []Coord{
|
||||
{Row: c.Row + 1, Col: c.Col},
|
||||
{Row: c.Row - 1, Col: c.Col},
|
||||
{Row: c.Row, Col: c.Col + 1},
|
||||
{Row: c.Row, Col: c.Col - 1},
|
||||
}
|
||||
}
|
||||
|
||||
for _, coord := range coords {
|
||||
neighborSymb, found := f.Cells[coord]
|
||||
if !found || neighborSymb == Tree {
|
||||
continue
|
||||
}
|
||||
neighbors = append(neighbors, coord)
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (f *Field) Neighbors(c Coord) (neighbors []Coord) {
|
||||
symb, exists := f.Cells[c]
|
||||
if !exists {
|
||||
panic(fmt.Sprintf("coord %+v not found in field", c))
|
||||
}
|
||||
|
||||
var coords []Coord
|
||||
switch symb {
|
||||
case Path:
|
||||
coords = []Coord{
|
||||
{Row: c.Row + 1, Col: c.Col},
|
||||
{Row: c.Row - 1, Col: c.Col},
|
||||
{Row: c.Row, Col: c.Col + 1},
|
||||
{Row: c.Row, Col: c.Col - 1},
|
||||
}
|
||||
case Tree:
|
||||
panic(fmt.Sprintf("attempting to get neighbors of a tree at %+v", c))
|
||||
case SlideDown:
|
||||
coords = []Coord{{Row: c.Row + 1, Col: c.Col}}
|
||||
case SlideUp:
|
||||
coords = []Coord{{Row: c.Row - 1, Col: c.Col}}
|
||||
case SlideLeft:
|
||||
coords = []Coord{{Row: c.Row, Col: c.Col - 1}}
|
||||
case SlideRight:
|
||||
coords = []Coord{{Row: c.Row, Col: c.Col + 1}}
|
||||
}
|
||||
|
||||
for _, coord := range coords {
|
||||
neighborSymb, found := f.Cells[coord]
|
||||
if !found || neighborSymb == Tree {
|
||||
continue
|
||||
}
|
||||
neighbors = append(neighbors, coord)
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (f *Field) String() (result string) {
|
||||
result += "\n"
|
||||
for row := 0; row <= f.MaxRow; row++ {
|
||||
for col := 0; col <= f.MaxCol; col++ {
|
||||
if row == 0 && col == f.StartCol {
|
||||
result += "S"
|
||||
continue
|
||||
}
|
||||
if row == f.MaxRow && col == f.EndCol {
|
||||
result += "E"
|
||||
continue
|
||||
}
|
||||
|
||||
symb := f.Cells[Coord{Row: row, Col: col}]
|
||||
result += string(symb)
|
||||
}
|
||||
result += "\n"
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (f *Field) SparseString() (result string) {
|
||||
result += "\n"
|
||||
for row := 0; row <= f.MaxRow; row++ {
|
||||
for col := 0; col <= f.MaxCol; col++ {
|
||||
if row == 0 && col == f.StartCol {
|
||||
result += "S"
|
||||
continue
|
||||
}
|
||||
if row == f.MaxRow && col == f.EndCol {
|
||||
result += "E"
|
||||
continue
|
||||
}
|
||||
|
||||
symb := f.Cells[Coord{Row: row, Col: col}]
|
||||
if symb != Tree {
|
||||
neighbors := f.NeighborsPart2(Coord{Row: row, Col: col})
|
||||
if len(neighbors) > 2 {
|
||||
result += "o"
|
||||
} else {
|
||||
result += "."
|
||||
}
|
||||
} else {
|
||||
result += " "
|
||||
}
|
||||
}
|
||||
result += "\n"
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func ReadField(filename string) (result Field) {
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
lines := strings.Split(strings.TrimSpace(string(bytes)), "\n")
|
||||
result.MaxRow = len(lines) - 1
|
||||
result.MaxCol = len(lines[0]) - 1
|
||||
rows := make(map[Coord]CellType)
|
||||
|
||||
for rowNum, row := range lines {
|
||||
for colNum, symb := range row {
|
||||
rows[Coord{Row: rowNum, Col: colNum}] = CellType(symb)
|
||||
if rowNum == 0 && symb == rune(Path) {
|
||||
result.StartCol = colNum
|
||||
}
|
||||
if rowNum == result.MaxRow && symb == rune(Path) {
|
||||
result.EndCol = colNum
|
||||
}
|
||||
}
|
||||
}
|
||||
result.Cells = rows
|
||||
|
||||
return
|
||||
}
|
||||
31
day23/field_test.go
Normal file
@@ -0,0 +1,31 @@
|
||||
package day23
|
||||
|
||||
import "testing"
|
||||
|
||||
func TestReadField(t *testing.T) {
|
||||
filename := "example"
|
||||
field := ReadField(filename)
|
||||
t.Log(field.String())
|
||||
}
|
||||
|
||||
func TestStartNeighbors(t *testing.T) {
|
||||
filename := "example"
|
||||
field := ReadField(filename)
|
||||
startNeighbors := field.Neighbors(Coord{Row: 0, Col: field.StartCol})
|
||||
t.Log(startNeighbors)
|
||||
}
|
||||
|
||||
// 5,3
|
||||
func TestForkNeighbors(t *testing.T) {
|
||||
filename := "example"
|
||||
field := ReadField(filename)
|
||||
startNeighbors := field.Neighbors(Coord{Row: 5, Col: 3})
|
||||
t.Log(startNeighbors)
|
||||
}
|
||||
|
||||
func TestSlideNeighbors(t *testing.T) {
|
||||
filename := "example"
|
||||
field := ReadField(filename)
|
||||
startNeighbors := field.Neighbors(Coord{Row: 6, Col: 3})
|
||||
t.Log(startNeighbors)
|
||||
}
|
||||
248
day23/graph.go
Normal file
@@ -0,0 +1,248 @@
|
||||
package day23
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"slices"
|
||||
|
||||
mapset "github.com/deckarep/golang-set/v2"
|
||||
)
|
||||
|
||||
type Node struct {
|
||||
index int
|
||||
c Coord
|
||||
name string
|
||||
}
|
||||
func (n Node) Name() string {
|
||||
var r string
|
||||
if n.index < 25 {
|
||||
num := 'A' + n.index
|
||||
r = string(rune(num))
|
||||
} else {
|
||||
num := 'a' + n.index - 25
|
||||
r = string(rune(num))
|
||||
}
|
||||
return r
|
||||
}
|
||||
|
||||
type Graph struct {
|
||||
nodes map[Coord]Node
|
||||
nodesByIndex []Node
|
||||
edges [][]int // from, to, length. excluding from, including to
|
||||
}
|
||||
|
||||
func MaxDist(from, to Node) (result int) {
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func PrintFieldWithGraph(g Graph, f Field) (result string) {
|
||||
result += "\n"
|
||||
for row := 0; row <= f.MaxRow; row++ {
|
||||
for col := 0; col <= f.MaxCol; col++ {
|
||||
symb := f.Cells[Coord{Row: row, Col: col}]
|
||||
if symb != Tree {
|
||||
coord := Coord{Row: row, Col: col}
|
||||
node, exists := g.nodes[coord]
|
||||
if exists {
|
||||
result += fmt.Sprint(node.Name())
|
||||
} else {
|
||||
result += "."
|
||||
}
|
||||
} else {
|
||||
result += " "
|
||||
}
|
||||
}
|
||||
result += "\n"
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func CreateGraph(f Field) (g Graph) {
|
||||
startCoord := Coord{Row: 0, Col: f.StartCol}
|
||||
// directly below start
|
||||
initialPath := PathEnd{
|
||||
end: Coord{Row: 1, Col: f.StartCol}, visited: mapset.NewSet[Coord](),
|
||||
}
|
||||
|
||||
g = Graph{
|
||||
nodes: map[Coord]Node{
|
||||
startCoord: Node{
|
||||
index: 0,
|
||||
c: startCoord,
|
||||
name: "A",
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
const presumedNodeCount = 36
|
||||
g.edges = make([][]int, presumedNodeCount)
|
||||
for i := 0; i < presumedNodeCount; i++ {
|
||||
g.edges[i] = make([]int, presumedNodeCount)
|
||||
}
|
||||
|
||||
recursiveGraphStep(f, initialPath, &g, startCoord, 1, mapset.NewSet[Coord]())
|
||||
g.edges[0][0] = 0
|
||||
|
||||
g.nodesByIndex = make([]Node, len(g.nodes))
|
||||
for _, node := range g.nodes {
|
||||
g.nodesByIndex[node.index] = node
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (g *Graph) Neighbors(node Node) (nodes []Node) {
|
||||
index := node.index
|
||||
for toIndex, len := range g.edges[index] {
|
||||
if len > 0 {
|
||||
nodes = append(nodes, g.nodesByIndex[toIndex])
|
||||
}
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
var maxSoFar int = -1
|
||||
func CheckMaxSoFar(maybe int) {
|
||||
if maybe > maxSoFar {
|
||||
maxSoFar = maybe
|
||||
}
|
||||
}
|
||||
|
||||
func (g *Graph) DFSLenOnGraph(atNode Node, visited mapset.Set[int],
|
||||
toNode Node, lenSoFar int) int {
|
||||
|
||||
if atNode == toNode {
|
||||
CheckMaxSoFar(lenSoFar)
|
||||
return lenSoFar
|
||||
}
|
||||
log.Printf("at %+v to %+v cur dist is %d.\t\t|max so far %d| \n", atNode, toNode, lenSoFar, maxSoFar)
|
||||
|
||||
neighbors := g.Neighbors(atNode)
|
||||
toVisit := slices.DeleteFunc(neighbors, func(n Node) bool {
|
||||
return visited.Contains(n.index)
|
||||
})
|
||||
|
||||
if len(toVisit) == 0 {
|
||||
return -1
|
||||
}
|
||||
max := -1
|
||||
|
||||
for _, nextNode := range toVisit {
|
||||
newVisited := visited.Clone()
|
||||
newVisited.Add(atNode.index)
|
||||
dist := g.edges[atNode.index][nextNode.index]
|
||||
maxFromNext := g.DFSLenOnGraph(nextNode, newVisited, toNode, lenSoFar + dist)
|
||||
if maxFromNext > max {
|
||||
max = maxFromNext
|
||||
}
|
||||
}
|
||||
|
||||
return max
|
||||
}
|
||||
|
||||
// run dfs, remembering from which node we go and which path was already traversed
|
||||
func recursiveGraphStep(f Field, p PathEnd, g *Graph, goingFrom Coord, goingLen int, visitedPathPoints mapset.Set[Coord]) {
|
||||
// log.Printf("entering coord %+v. from %+v with len %d\n", p.end, goingFrom, goingLen)
|
||||
|
||||
// if visitedPathPoints.Contains(p.end) {
|
||||
// return
|
||||
// }
|
||||
|
||||
neighbors := f.NeighborsPart2(p.end)
|
||||
|
||||
isCrossRoad := len(neighbors) > 2
|
||||
if isCrossRoad {
|
||||
log.Println("this should be crossroad ", p.end)
|
||||
}
|
||||
isStart := p.end == Coord{Row: 0, Col: f.StartCol}
|
||||
isEnd := p.end == f.EndCoord()
|
||||
if isEnd {
|
||||
log.Println("this should be end ", p.end)
|
||||
}
|
||||
|
||||
isNode := isCrossRoad || isStart || isEnd
|
||||
|
||||
continuedPaths := ExtendPath(p, f)
|
||||
|
||||
if !isNode {
|
||||
// just recurse into next paths, from same node, with increased len
|
||||
visitedPathPoints.Add(p.end)
|
||||
for _, nextStep := range continuedPaths {
|
||||
recursiveGraphStep(f, nextStep, g, goingFrom, goingLen+1, visitedPathPoints)
|
||||
}
|
||||
} else {
|
||||
node, known := g.nodes[p.end]
|
||||
// check if known, if not known - create
|
||||
if !known {
|
||||
node = Node{
|
||||
c: p.end,
|
||||
index: len(g.nodes),
|
||||
}
|
||||
node.name = node.Name()
|
||||
g.nodes[p.end] = node
|
||||
log.Printf("creating node %s %+v\n", node.Name(), node)
|
||||
}
|
||||
from := g.nodes[goingFrom]
|
||||
log.Printf("from %s to %s\n", from.Name(), node.Name())
|
||||
// and add vertices to currently traversed
|
||||
if g.edges[node.index][from.index] == 0 {
|
||||
g.edges[node.index][from.index] = goingLen
|
||||
g.edges[from.index][node.index] = goingLen
|
||||
} else {
|
||||
knownEdge := g.edges[node.index][from.index]
|
||||
if goingLen > knownEdge {
|
||||
g.edges[node.index][from.index] = goingLen
|
||||
g.edges[from.index][node.index] = goingLen
|
||||
}
|
||||
}
|
||||
// NOTE ah, it's possible to have two edges between i and j
|
||||
// but, i only need the longest one
|
||||
// log.Printf("adding edges between %d & %d of len %d\n", node.index, from.index, goingLen)
|
||||
|
||||
// continue with new 'from' and len of 1
|
||||
if !known {
|
||||
for _, nextStep := range continuedPaths {
|
||||
log.Printf("from %s should recurse to %+v", node.Name(), nextStep)
|
||||
recursiveGraphStep(f, nextStep, g, p.end, 1, visitedPathPoints)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func GraphToMermaid(g Graph) (result string) {
|
||||
result += "\nflowchart LR\n"
|
||||
lines := mapset.NewSet[string]()
|
||||
for _, node := range g.nodes {
|
||||
for to, len := range g.edges[node.index] {
|
||||
var toNode Node
|
||||
for _, other := range g.nodes {
|
||||
if other.index == to {
|
||||
toNode = other
|
||||
}
|
||||
}
|
||||
if len > 0 {
|
||||
var fromName, toName string
|
||||
if node.index < toNode.index {
|
||||
fromName = node.Name()
|
||||
toName = toNode.Name()
|
||||
} else {
|
||||
fromName = toNode.Name()
|
||||
toName = node.Name()
|
||||
}
|
||||
line := fmt.Sprintf("\t%s---|length %d|%s\n", fromName, len, toName)
|
||||
lines.Add(line)
|
||||
}
|
||||
}
|
||||
// result += fmt.Sprintf("%s--|%d|%s\n", a ...any)
|
||||
}
|
||||
|
||||
for line := range lines.Iter() {
|
||||
result += line
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
101
day23/graph_test.go
Normal file
@@ -0,0 +1,101 @@
|
||||
package day23
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"testing"
|
||||
|
||||
mapset "github.com/deckarep/golang-set/v2"
|
||||
)
|
||||
|
||||
func TestGraphCreate(t *testing.T) {
|
||||
filename := "example2"
|
||||
field := ReadField(filename)
|
||||
|
||||
fmt.Println(field.SparseString())
|
||||
|
||||
graph := CreateGraph(field)
|
||||
t.Log(graph)
|
||||
}
|
||||
|
||||
func TestPrintGraph(t *testing.T) {
|
||||
filename := "example2"
|
||||
field := ReadField(filename)
|
||||
|
||||
fmt.Println(field.SparseString())
|
||||
|
||||
graph := CreateGraph(field)
|
||||
t.Log(PrintFieldWithGraph(graph, field))
|
||||
t.Logf(">>>\n %+v\n", graph)
|
||||
|
||||
}
|
||||
|
||||
func TestPrintGraphInput(t *testing.T) {
|
||||
filename := "input"
|
||||
field := ReadField(filename)
|
||||
|
||||
fmt.Println(field.SparseString())
|
||||
|
||||
graph := CreateGraph(field)
|
||||
t.Log(PrintFieldWithGraph(graph, field))
|
||||
t.Logf(">>>\n %+v\n", graph)
|
||||
|
||||
}
|
||||
|
||||
func TestPrintMermaidGraphInput(t *testing.T) {
|
||||
filename := "input"
|
||||
field := ReadField(filename)
|
||||
|
||||
fmt.Println(field.SparseString())
|
||||
|
||||
graph := CreateGraph(field)
|
||||
mmdContent := GraphToMermaid(graph)
|
||||
t.Log(mmdContent)
|
||||
|
||||
fileBorder, err := os.Create(filename + ".mmd")
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
defer func() {
|
||||
if err := fileBorder.Close(); err != nil {
|
||||
panic(err)
|
||||
}
|
||||
}()
|
||||
|
||||
fileBorder.WriteString(mmdContent)
|
||||
}
|
||||
|
||||
func TestGraphMaxBetweenExample(t *testing.T) {
|
||||
filename := "example"
|
||||
field := ReadField(filename)
|
||||
graph := CreateGraph(field)
|
||||
|
||||
t.Log(PrintFieldWithGraph(graph, field))
|
||||
|
||||
|
||||
from := graph.nodes[Coord{Row: 0, Col: field.StartCol}]
|
||||
to := graph.nodes[field.EndCoord()]
|
||||
|
||||
dist := graph.DFSLenOnGraph(from, mapset.NewSet[int](), to, 0)
|
||||
|
||||
t.Log(graph)
|
||||
t.Logf("please dist %d", dist)
|
||||
|
||||
}
|
||||
|
||||
func TestGraphMaxBetweenInput(t *testing.T) {
|
||||
filename := "input"
|
||||
field := ReadField(filename)
|
||||
|
||||
graph := CreateGraph(field)
|
||||
|
||||
t.Log(PrintFieldWithGraph(graph, field))
|
||||
from := graph.nodes[Coord{Row: 0, Col: field.StartCol}]
|
||||
to := graph.nodes[field.EndCoord()]
|
||||
|
||||
dist := graph.DFSLenOnGraph(from, mapset.NewSet[int](), to, 0)
|
||||
|
||||
t.Log(graph)
|
||||
t.Logf("please dist %d", dist)
|
||||
|
||||
}
|
||||
62
day23/input.mmd
Normal file
@@ -0,0 +1,62 @@
|
||||
|
||||
flowchart LR
|
||||
M---|length 166|N
|
||||
d---|length 62|h
|
||||
H---|length 190|I
|
||||
f---|length 136|h
|
||||
j---|length 94|k
|
||||
B---|length 152|L
|
||||
I---|length 40|J
|
||||
W---|length 56|X
|
||||
E---|length 214|F
|
||||
C---|length 60|K
|
||||
V---|length 142|b
|
||||
a---|length 110|b
|
||||
I---|length 138|P
|
||||
J---|length 184|K
|
||||
Y---|length 146|a
|
||||
c---|length 190|d
|
||||
Q---|length 114|T
|
||||
J---|length 240|O
|
||||
C---|length 184|D
|
||||
L---|length 172|M
|
||||
Q---|length 140|R
|
||||
Y---|length 464|k
|
||||
O---|length 76|V
|
||||
N---|length 102|O
|
||||
K---|length 152|L
|
||||
U---|length 80|c
|
||||
V---|length 72|W
|
||||
b---|length 202|j
|
||||
A---|length 39|B
|
||||
W---|length 236|a
|
||||
P---|length 166|Q
|
||||
e---|length 174|f
|
||||
G---|length 186|R
|
||||
T---|length 258|d
|
||||
X---|length 142|Y
|
||||
b---|length 128|c
|
||||
F---|length 378|G
|
||||
S---|length 108|T
|
||||
N---|length 62|W
|
||||
U---|length 110|V
|
||||
a---|length 138|k
|
||||
S---|length 234|e
|
||||
d---|length 108|e
|
||||
H---|length 166|Q
|
||||
O---|length 158|P
|
||||
M---|length 360|X
|
||||
h---|length 184|i
|
||||
B---|length 244|C
|
||||
D---|length 96|J
|
||||
D---|length 154|E
|
||||
R---|length 118|S
|
||||
E---|length 146|I
|
||||
P---|length 128|U
|
||||
T---|length 268|U
|
||||
i---|length 198|j
|
||||
G---|length 144|H
|
||||
F---|length 102|H
|
||||
f---|length 77|g
|
||||
K---|length 266|N
|
||||
c---|length 64|i
|
||||
125
day23/notes.org
Normal file
@@ -0,0 +1,125 @@
|
||||
#+title: Notes
|
||||
* ok, second part is long.
|
||||
and here, the optimization of storing the direction of entry into a path, and its length,
|
||||
would that be helpful?
|
||||
it might not, because based on visited some future longer path might not be available.
|
||||
|
||||
i don't know how to optimize.
|
||||
|
||||
i could maybe do same calculation in parallel, somehow
|
||||
put not into queue, but into channel
|
||||
* wait a second. previous answer was 2018
|
||||
and now long checks result in me waiting for intermediate 1882.
|
||||
let's add early cutoffs, if not end by 2018, then abandon
|
||||
doubt that's implementable
|
||||
|
||||
* well. do i want to try parallel?
|
||||
seems like false path, really
|
||||
like there should be a better optimization first
|
||||
* maybe we could join detours into 'potential longest paths'
|
||||
like if we traverse, and get to a point which was previously visited,
|
||||
for every path that went through that split path,
|
||||
i could check whether i can take paths that went through this point, and switch their part with the detoured part.
|
||||
* and maybe we could continue longest paths first?
|
||||
like making pathsToFurther a heap by visited.Cardinality ?
|
||||
|
||||
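a sketch of what that heap could look like with container/heap (assuming the PathEnd type from paths.go; this was never wired in):

#+begin_src go
// max-heap of PathEnd, ordered by how many cells the path has visited
type pathHeap []PathEnd

func (h pathHeap) Len() int           { return len(h) }
func (h pathHeap) Less(i, j int) bool { return h[i].visited.Cardinality() > h[j].visited.Cardinality() }
func (h pathHeap) Swap(i, j int)      { h[i], h[j] = h[j], h[i] }
func (h *pathHeap) Push(x any)        { *h = append(*h, x.(PathEnd)) }
func (h *pathHeap) Pop() any {
	old := *h
	item := old[len(old)-1]
	*h = old[:len(old)-1]
	return item
}

// usage: heap.Init(&h); heap.Push(&h, p); longest := heap.Pop(&h).(PathEnd)
#+end_src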
oh, and then we'll find 'some path to end'
|
||||
and additional paths will try to add their detour.
|
||||
|
||||
so, i guess when finding a path to end, i could save path to end for each point.
|
||||
then if i reach the point, i could check if i can use some of the
|
||||
|
||||
and i guess if i do depth first then i'll always have all paths to end from a point if i return to it?
|
||||
* this sounds like an idea.
|
||||
with heap do depth first.
|
||||
|
||||
if it's first visit to a point, just go further
|
||||
if i find the end, i'd want to mark all points on the path with path info
|
||||
|
||||
hm. recursive calls might make this easier.
|
||||
because i'd want both 'prefixVisited' set and totalPathSet
|
||||
|
||||
due to depth first, we'll discover shortest path first.
|
||||
and points will get mapped with this first (of potential multiple) path info to end.
|
||||
|
||||
now if on followup steps i get into the point with info on paths to end,
|
||||
that should mean that i've already found all paths to end from that point, right?
|
||||
|
||||
now i need to check for the 'detour' which 'paths to end' are still possible with that detour added
|
||||
by taking set of elements from this point, to end. and checking that intersection with detour elements is 0.
|
||||
|
||||
if there are paths like this - report finding a new path, and save it to all elements of that path somehow.
|
||||
|
||||
and now on finding detours i wouldn't need to re-check path to end, that should save a lot of time
|
||||
** so how to go about in coding this?
|
||||
have shared map[Coord][]EndPathInfo
|
||||
|
||||
the DFS means i'm recursing into each child.
|
||||
and taking the result of the call.
|
||||
it should be info on path to end? or multiple paths to end.
|
||||
which should be added to current node.
|
||||
|
||||
and then calling with start point will return paths to end from start, and i'll be able to take the by length
|
||||
|
||||
ok. but. if i'm entering the coord, and there are already paths to end.
|
||||
then i need to presume that those are only possible paths to end from this point,
|
||||
because all other paths should have been explored by now,
|
||||
for my 'detour' i determine whether it is consistent with any of the already found paths to end.
|
||||
** NO. dfs doesn't mean i'll find shortest path first.
|
||||
so if i'm in visited, it doesn't mean that stored is shorter and current is a detour.
|
||||
|
||||
but dfs should mean that all paths from this prefix have finished.
|
||||
so, sure. there have to be all done?
|
||||
** my example2 has fork on row 3 col 10
|
||||
so 4,10 and 3,11 should be visited separately.
|
||||
|
||||
6,17 is where they join and the point which should have second entry
|
||||
** alright, ugh. my new solution is memory hogging.
|
||||
maybe i can draw the stuff and it will be several neat thingies
|
||||
* maybe new approach?
|
||||
make a graph. with vertices of Start, End and Crossroads.
|
||||
yes.
|
||||
let's create a graph representation.
|
||||
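the contraction itself is just walking a corridor until the next junction, counting steps (illustrative sketch only; the real version is recursiveGraphStep in graph.go):

#+begin_src go
// from a junction, step into a corridor and keep walking until the next cell
// that is not a plain corridor cell (a junction, the start, the end, or a dead end)
func walkCorridor(f Field, from, into Coord) (Coord, int) {
	prev, cur, steps := from, into, 1
	for {
		next := f.NeighborsPart2(cur)
		if len(next) != 2 {
			return cur, steps // reached a node of the contracted graph
		}
		for _, n := range next {
			if n != prev {
				prev, cur = cur, n
				break
			}
		}
		steps++
	}
}
#+end_src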
** so, from A to AG
|
||||
i think i can do this manually now
|
||||
** distances are
|
||||
39
|
||||
244
|
||||
184
|
||||
154
|
||||
214
|
||||
378
|
||||
144
|
||||
190
|
||||
40
|
||||
184
|
||||
152
|
||||
172
|
||||
166
|
||||
102
|
||||
158
|
||||
166
|
||||
140
|
||||
118
|
||||
108
|
||||
268
|
||||
110
|
||||
72
|
||||
56
|
||||
142
|
||||
146
|
||||
110
|
||||
128
|
||||
190
|
||||
108
|
||||
174
|
||||
77
|
||||
1
|
||||
** again?
|
||||
no, let's write code.
|
||||
** didn't count all the way
|
||||
2023/12/23 15:55:55 at {index:30 c:{Row:125 Col:137} name:f} to {index:31 c:{Row:140 Col:139} name:g} cur dist is 3997. |max so far 6406|
|
||||
signal: interrupt
|
||||
FAIL sunshine.industries/aoc2023/day23 380.499s
|
||||
|
||||
tried more or less stable value, and interrupted
|
||||
161
day23/paths.go
Normal file
@@ -0,0 +1,161 @@
|
||||
package day23
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
|
||||
mapset "github.com/deckarep/golang-set/v2"
|
||||
)
|
||||
|
||||
type PathEnd struct {
|
||||
end Coord
|
||||
visited mapset.Set[Coord]
|
||||
}
|
||||
func (p PathEnd) String() string {
|
||||
return fmt.Sprintf("PathEnd[at %+v, visited: %+v]", p.end, p.visited)
|
||||
}
|
||||
|
||||
func ExtendPath(p PathEnd, f Field) (nextPaths []PathEnd) {
|
||||
endPointNeighbors := f.NeighborsPart2(p.end)
|
||||
for _, potentialNewEnd := range endPointNeighbors {
|
||||
if !p.visited.Contains(potentialNewEnd) {
|
||||
nextVisited := p.visited.Clone()
|
||||
nextVisited.Add(p.end)
|
||||
nextPaths = append(nextPaths, PathEnd{
|
||||
end: potentialNewEnd,
|
||||
visited: nextVisited,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
// info on path from start to end
|
||||
type PathInfo struct {
|
||||
Visited mapset.Set[Coord]
|
||||
}
|
||||
|
||||
func RunDFSTingy(f Field) []PathInfo {
|
||||
initialPath := PathEnd{
|
||||
end: Coord{Row: 0, Col: f.StartCol}, visited: mapset.NewSet[Coord](),
|
||||
}
|
||||
initialShared := make(map[Coord][]PathInfo)
|
||||
|
||||
return DFSScenicPaths(f, initialPath, initialShared)
|
||||
}
|
||||
|
||||
var knownMax int = 0
|
||||
func CheckAndPrintMax(maybeNewMax int) {
|
||||
if maybeNewMax > knownMax {
|
||||
log.Printf("\n\n>>>>found new max: %d\n", maybeNewMax)
|
||||
knownMax = maybeNewMax
|
||||
}
|
||||
}
|
||||
|
||||
func DFSScenicPaths(f Field, curPath PathEnd,
|
||||
sharedMem map[Coord][]PathInfo) (pathsFromTheStartToEnd []PathInfo) {
|
||||
curCoord := curPath.end
|
||||
|
||||
if curCoord == (Coord{ Row: 6, Col: 15 }) {
|
||||
log.Println(">>>>>>>>")
|
||||
}
|
||||
// log.Printf("entering %+v with mem %+v\n", curPath, sharedMem[curCoord])
|
||||
|
||||
if curCoord == f.EndCoord() {
|
||||
pathsFromTheStartToEnd = append(pathsFromTheStartToEnd, PathInfo{curPath.visited.Clone()})
|
||||
log.Printf("got to end. cur len is %d\n", curPath.visited.Cardinality())
|
||||
CheckAndPrintMax(curPath.visited.Cardinality())
|
||||
// i guess return only from current to end?
|
||||
// and on non terminal first time, return copy with self added?
|
||||
return
|
||||
}
|
||||
|
||||
// now for non final point
|
||||
knownPaths, visitedBefore := sharedMem[curCoord]
|
||||
|
||||
// NOTE but this only if we haven't visited this coord before!
|
||||
|
||||
if !visitedBefore {
|
||||
nextSteps := ExtendPath(curPath, f)
|
||||
suffixesFromCurToEnd := make([]PathInfo, 0)
|
||||
for _, nextPath := range nextSteps {
|
||||
pathsToEndThrough := DFSScenicPaths(f, nextPath, sharedMem)
|
||||
// i guess here deduct the prefix.
|
||||
|
||||
for _, path := range pathsToEndThrough {
|
||||
// will contain this and further
|
||||
suffix := PathInfo{
|
||||
Visited: path.Visited.Difference(curPath.visited).Clone(),
|
||||
}
|
||||
// log.Printf(">> from path \n%+v make suffix \n%+v\n\n", path, suffix)
|
||||
suffixesFromCurToEnd = append(suffixesFromCurToEnd, suffix)
|
||||
}
|
||||
|
||||
pathsFromTheStartToEnd = append(pathsFromTheStartToEnd, pathsToEndThrough...)
|
||||
|
||||
if len(pathsToEndThrough) != 0 {
|
||||
// log.Printf("setting mem for %+v to %+v", curCoord, suffixesFromCurToEnd)
|
||||
sharedMem[curCoord] = suffixesFromCurToEnd
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
} else {
|
||||
// have visited this point before, due to dfs all possible paths to end should already be known
|
||||
// so curPath.visited should contain a detour.
|
||||
// need to figure out if this detour is compatible with any of the known paths to end
|
||||
// from those create 'new' paths to end with that detour
|
||||
// return those and add those to the shared mem
|
||||
for _, knownPathToEnd := range knownPaths {
|
||||
// those are all points through which this known path goes from current to end
|
||||
// if our curPath
|
||||
fromCurToEnd := knownPathToEnd.Visited
|
||||
thisPrefix := curPath.visited
|
||||
|
||||
if thisPrefix.Intersect(fromCurToEnd).Cardinality() == 0 {
|
||||
// then current prefix is compatible with this path.
|
||||
fromCurPrefixToEnd := thisPrefix.Clone()
|
||||
fromCurPrefixToEnd = fromCurPrefixToEnd.Union(fromCurToEnd) // Union returns a new set, so reassign
|
||||
pathsFromTheStartToEnd = append(pathsFromTheStartToEnd, PathInfo{fromCurPrefixToEnd})
|
||||
log.Printf("additional path to end of len %d\n", fromCurPrefixToEnd.Cardinality())
|
||||
CheckAndPrintMax(fromCurPrefixToEnd.Cardinality())
|
||||
}
|
||||
}
|
||||
log.Printf("having second visit into %+v.\n", curPath)
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
panic("should not be reachable")
|
||||
}
|
||||
|
||||
// return paths that end on End
|
||||
func RunAllScenicPaths(f Field) (result []PathEnd) {
|
||||
pathsToFurther := []PathEnd{
|
||||
{end: Coord{Row: 0, Col: f.StartCol}, visited: mapset.NewSet[Coord]()},
|
||||
}
|
||||
|
||||
for len(pathsToFurther) > 0 {
|
||||
curCheckedPath := pathsToFurther[0]
|
||||
pathsToFurther = pathsToFurther[1:]
|
||||
|
||||
if curCheckedPath.end == f.EndCoord() {
|
||||
result = append(result, curCheckedPath)
|
||||
// log.Printf("found end path of len %d . %+v", curCheckedPath.visited.Cardinality(), curCheckedPath)
|
||||
continue
|
||||
}
|
||||
|
||||
nextSteps := ExtendPath(curCheckedPath, f)
|
||||
|
||||
// log.Printf("for %+v next steps %+v\n", curCheckedPath, pathsToFurther)
|
||||
// log.Printf("remaining paths to check len is %d", len(pathsToFurther))
|
||||
// log.Println(pathsToFurther)
|
||||
|
||||
if len(nextSteps) > 0 {
|
||||
pathsToFurther = append(pathsToFurther, nextSteps...)
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
19
day23/paths_test.go
Normal file
@@ -0,0 +1,19 @@
|
||||
package day23
|
||||
|
||||
import "testing"
|
||||
|
||||
func TestRunAllPaths(t *testing.T) {
|
||||
filename := "example"
|
||||
field := ReadField(filename)
|
||||
finalPaths := RunAllScenicPaths(field)
|
||||
t.Log(finalPaths)
|
||||
|
||||
max := 0
|
||||
for _, path := range finalPaths {
|
||||
if path.visited.Cardinality() > max {
|
||||
max = path.visited.Cardinality()
|
||||
}
|
||||
}
|
||||
t.Logf("max path len is %d", max)
|
||||
|
||||
}
|
||||
72
day23/willHelp.mmd
Normal file
@@ -0,0 +1,72 @@
|
||||
flowchart LR
|
||||
L---|length 152|K
|
||||
L---|length 172|M
|
||||
U---|length 268|T
|
||||
U---|length 110|V
|
||||
W---|length 72|V
|
||||
W---|length 56|X
|
||||
a---|length 146|Y
|
||||
a---|length 110|b
|
||||
f---|length 174|e
|
||||
f---|length 77|g
|
||||
f---|length 136|h
|
||||
H---|length 144|G
|
||||
H---|length 190|I
|
||||
T---|length 108|S
|
||||
T---|length 268|U
|
||||
M---|length 172|L
|
||||
M---|length 166|N
|
||||
F---|length 214|E
|
||||
F---|length 378|G
|
||||
I---|length 190|H
|
||||
I---|length 40|J
|
||||
A---|length 2|A
|
||||
A---|length 39|B
|
||||
Q---|length 166|P
|
||||
Q---|length 140|R
|
||||
Y---|length 142|X
|
||||
Y---|length 146|a
|
||||
d---|length 190|c
|
||||
d---|length 108|e
|
||||
e---|length 108|d
|
||||
e---|length 174|f
|
||||
h---|length 136|f
|
||||
h---|length 184|i
|
||||
J---|length 40|I
|
||||
J---|length 184|K
|
||||
N---|length 166|M
|
||||
N---|length 102|O
|
||||
X---|length 56|W
|
||||
X---|length 142|Y
|
||||
j---|length 198|i
|
||||
j---|length 94|k
|
||||
B---|length 39|A
|
||||
B---|length 244|C
|
||||
G---|length 378|F
|
||||
G---|length 144|H
|
||||
P---|length 158|O
|
||||
P---|length 166|Q
|
||||
D---|length 184|C
|
||||
D---|length 154|E
|
||||
E---|length 154|D
|
||||
E---|length 214|F
|
||||
K---|length 184|J
|
||||
K---|length 152|L
|
||||
O---|length 102|N
|
||||
O---|length 158|P
|
||||
R---|length 140|Q
|
||||
R---|length 118|S
|
||||
S---|length 118|R
|
||||
S---|length 108|T
|
||||
V---|length 110|U
|
||||
V---|length 72|W
|
||||
c---|length 128|b
|
||||
c---|length 190|d
|
||||
C---|length 244|B
|
||||
C---|length 184|D
|
||||
k---|length 94|j
|
||||
i---|length 184|h
|
||||
i---|length 198|j
|
||||
g---|length 77|f
|
||||
b---|length 110|a
|
||||
b---|length 128|c
|
||||
BIN
day23/willHelp.png
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 203 KiB |
5
day24/example
Normal file
@@ -0,0 +1,5 @@
|
||||
19, 13, 30 @ -2, 1, -2
|
||||
18, 19, 22 @ -1, -1, -2
|
||||
20, 25, 34 @ -2, -2, -4
|
||||
12, 31, 28 @ -1, -2, -1
|
||||
20, 19, 15 @ 1, -5, -3
|
||||
73
day24/hailMary.go
Normal file
@@ -0,0 +1,73 @@
|
||||
package day24
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
)
|
||||
|
||||
// innermost loop
|
||||
// assuming the stone hits h1 at t1, and h2 at t2
|
||||
// return the line. so 'HailParam' for my stone trajectory
|
||||
func AssumeHails(h1, h2 HailParam, t1, t2 int) (stoneTrajectory HailParam, isInt bool) {
|
||||
Dx, isXInt := AssumedDelta(h1.p0.x, h2.p0.x, h1.Dx, h2.Dx, t1, t2)
|
||||
Dy, isYInt := AssumedDelta(h1.p0.y, h2.p0.y, h1.Dy, h2.Dy, t1, t2)
|
||||
Dz, isZInt := AssumedDelta(h1.p0.z, h2.p0.z, h1.Dz, h2.Dz, t1, t2)
|
||||
|
||||
isInt = isXInt && isYInt && isZInt
|
||||
|
||||
x := AssumedStartFromDelta(h1.p0.x, h1.Dx, t1, Dx)
|
||||
y := AssumedStartFromDelta(h1.p0.y, h1.Dy, t1, Dy)
|
||||
z := AssumedStartFromDelta(h1.p0.z, h1.Dz, t1, Dz)
|
||||
|
||||
stoneTrajectoryLine := fmt.Sprintf("%d, %d, %d @ %d, %d, %d", x, y, z, Dx, Dy, Dz)
|
||||
stoneTrajectory = ReadHailLine(stoneTrajectoryLine)
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func HailMaryLoop(hails []HailParam) {
|
||||
// for t1, t2 from [1, 100]
|
||||
// try to fit stoneTrajectory on every pair of hails.
|
||||
// and hope for integer fit
|
||||
for t1 := 1; t1 <= 100; t1++ {
|
||||
for t2 := t1 + 1; t2 <= 100; t2++ {
|
||||
for i, hail := range hails {
|
||||
innerHail:
|
||||
for j, otherHail := range hails {
|
||||
if i == j {
|
||||
continue innerHail
|
||||
}
|
||||
_, isInt := AssumeHails(hail, otherHail, t1, t2)
|
||||
if !isInt {
|
||||
continue innerHail // TODO first hope to lose
|
||||
}
|
||||
// if isInt {
|
||||
// log.Printf("hail mary int fit between %s (%d) and %s (%d)",
|
||||
// hail.SomeString(), t1, otherHail.SomeString(), t2)
|
||||
// }
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// TODO check for inner loop: once we get an assumed stoneTrajectory
|
||||
// for all hail params, check that they intercept
|
||||
// func CheckAssumedTrajectory(assumedStone HailParam, hails []HailParam) bool {
|
||||
// for _, hail := range hails {
|
||||
// // i guess i could try to do what?
|
||||
// // assume oh, no. there can be t whatever
|
||||
// }
|
||||
// }
|
||||
|
||||
func AssumedDelta(c1, c2 int, Dc1, Dc2 int, t1, t2 int) (delta int, isInt bool) {
|
||||
divisor := t1 - t2
|
||||
divisible := c1 - c2 + (t1 * Dc1) - (t2 * Dc2)
|
||||
|
||||
isInt = divisible % divisor == 0
|
||||
delta = divisible / divisor
|
||||
return
|
||||
}
|
||||
|
||||
func AssumedStartFromDelta(c1 int, Dc1 int, t1, Dc int) (c int) {
|
||||
return c1 + t1 * Dc1 - t1 * Dc
|
||||
}
|
||||
9
day24/hailMary_test.go
Normal file
9
day24/hailMary_test.go
Normal file
@@ -0,0 +1,9 @@
|
||||
package day24
|
||||
|
||||
import "testing"
|
||||
|
||||
func TestHailMaryOnExample(t *testing.T) {
|
||||
filename := "input"
|
||||
hails := ReadHailFile(filename)
|
||||
HailMaryLoop(hails)
|
||||
}
|
||||
196
day24/lines.go
Normal file
196
day24/lines.go
Normal file
@@ -0,0 +1,196 @@
|
||||
package day24
|
||||
|
||||
import (
|
||||
"log"
|
||||
"os"
|
||||
"strconv"
|
||||
"strings"
|
||||
)
|
||||
|
||||
const (
|
||||
// CoordMin int = 7
|
||||
// CoordMax int = 27
|
||||
CoordMin int = 200000000000000
|
||||
CoordMax int = 400000000000000
|
||||
)
|
||||
|
||||
type Point struct {
|
||||
x, y, z int
|
||||
}
|
||||
|
||||
type HailParam struct {
|
||||
p0, p1 Point
|
||||
Dx, Dy, Dz int
|
||||
line string
|
||||
// for 2d : a*x + b*y = c (two-point line form)
|
||||
a, b, c int
|
||||
// for 2d : y = slope*x + shift
|
||||
slope, shift float64
|
||||
}
|
||||
|
||||
func (h *HailParam) SomeString() string {
|
||||
return h.line
|
||||
}
|
||||
|
||||
func (h *HailParam) GetCoord(name string) (result int) {
|
||||
switch name {
|
||||
case "x":
|
||||
result = h.p0.x
|
||||
case "y":
|
||||
result = h.p0.y
|
||||
case "z":
|
||||
result = h.p0.z
|
||||
default:
|
||||
panic("unknown param")
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (h *HailParam) GetSpeedOf(name string) (result int) {
|
||||
switch name {
|
||||
case "x":
|
||||
result = h.Dx
|
||||
case "y":
|
||||
result = h.Dy
|
||||
case "z":
|
||||
result = h.Dz
|
||||
default:
|
||||
panic("unknown param")
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func CheckPairwiseIntersections(hails []HailParam) (totalIntersections int) {
|
||||
for i, hail := range hails {
|
||||
for j := i + 1; j < len(hails); j++ {
|
||||
otherHail := hails[j]
|
||||
intersect := CheckTaskIntersection(hail, otherHail)
|
||||
if intersect {
|
||||
totalIntersections += 1
|
||||
}
|
||||
}
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func CheckTaskIntersection(h1, h2 HailParam) (doIntersect bool) {
|
||||
log.Printf("intersecting %+v and %+v\n", h1, h2)
|
||||
// x, y, intersectAtAll := IntersectByTwoPoints(h1, h2)
|
||||
x, y, intersectAtAll := IntersectBySlopeAndShift(h1, h2)
|
||||
if !intersectAtAll {
|
||||
log.Println("no intersection at all\n", x, y)
|
||||
return false
|
||||
}
|
||||
isH1Future := h1.FloatPointInFuture(x, y)
|
||||
isH2Future := h2.FloatPointInFuture(x, y)
|
||||
|
||||
if !isH1Future {
|
||||
log.Printf("point %f, %f in the past for h1\n", x, y)
|
||||
}
|
||||
if !isH2Future {
|
||||
log.Printf("point %f, %f in the past for h2\n", x, y)
|
||||
}
|
||||
if !isH1Future || !isH2Future {
|
||||
return false
|
||||
}
|
||||
|
||||
if x < float64(CoordMin) || x > float64(CoordMax) ||
|
||||
y < float64(CoordMin) || y > float64(CoordMax) {
|
||||
log.Printf("intersect at %f %f but outside of area\n", x, y)
|
||||
return false // outside of area
|
||||
}
|
||||
|
||||
log.Println("> intersect inside of the area! ", x, y)
|
||||
return true
|
||||
}
|
||||
|
||||
func IntersectInThreeDimensions(h1, h2 HailParam) (interX, interY, interZ float64,
|
||||
interT float64, isIntersecting bool) {
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func IntersectBySlopeAndShift(h1, h2 HailParam) (intersectionX, intersectionY float64, isIntersecting bool) {
|
||||
if h1.slope == h2.slope {
|
||||
return
|
||||
}
|
||||
// y = slope * x + shift
|
||||
// slope1 * x + shift1 = slope2 * x + shift2
|
||||
// x = ( shift2 - shift1 ) / (slope1 - slope2)
|
||||
|
||||
x := (h2.shift - h1.shift) / (h1.slope - h2.slope)
|
||||
y := h1.slope*x + h1.shift
|
||||
|
||||
return x, y, true
|
||||
}
|
||||
|
||||
func (h HailParam) PointInFuture(p Point) bool {
|
||||
xPositiveSteps := (p.x-h.p0.x)*h.Dx >= 0
|
||||
yPositiveSteps := (p.y-h.p0.y)*h.Dy >= 0
|
||||
zPositiveSteps := (p.z-h.p0.z)*h.Dz >= 0
|
||||
return xPositiveSteps && yPositiveSteps && zPositiveSteps
|
||||
}
|
||||
func (h HailParam) FloatPointInFuture(x, y float64) bool {
|
||||
xPositiveSteps := (x-float64(h.p0.x))*float64(h.Dx) >= 0
|
||||
// yPositiveSteps := (y - float64(h.p0.y)) * float64(h.Dy) >= 0
|
||||
// return xPositiveSteps && yPositiveSteps
|
||||
return xPositiveSteps
|
||||
}
|
||||
|
||||
// 19, 13, 30 @ -2, 1, -2
|
||||
func ReadHailLine(line string) (h HailParam) {
|
||||
h.line = line
|
||||
line = strings.ReplaceAll(line, "@", "")
|
||||
line = strings.ReplaceAll(line, ",", "")
|
||||
fields := strings.Fields(line)
|
||||
|
||||
h.p0.x = AtoIOrPanic(fields[0])
|
||||
h.p0.y = AtoIOrPanic(fields[1])
|
||||
h.p0.z = AtoIOrPanic(fields[2])
|
||||
h.Dx = AtoIOrPanic(fields[3])
|
||||
h.Dy = AtoIOrPanic(fields[4])
|
||||
h.Dz = AtoIOrPanic(fields[5])
|
||||
|
||||
countP1AfterMillis := 1
|
||||
|
||||
h.p1.x = h.p0.x + countP1AfterMillis*h.Dx
|
||||
h.p1.y = h.p0.y + countP1AfterMillis*h.Dy
|
||||
h.p1.z = h.p0.z + countP1AfterMillis*h.Dz
|
||||
|
||||
h.a = h.p0.y - h.p1.y
|
||||
h.b = h.p1.x - h.p0.x
|
||||
h.c = -(h.p0.x*h.p1.y - h.p1.x*h.p0.y)
|
||||
|
||||
h.slope = float64(h.Dy) / float64(h.Dx)
|
||||
// y = slope * x + shift
|
||||
// shift = y - slope * x // for some point
|
||||
h.shift = float64(h.p0.y) - h.slope*float64(h.p0.x)
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func ReadHailFile(filename string) []HailParam {
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
text := strings.TrimSpace(string(bytes))
|
||||
lines := strings.Split(text, "\n")
|
||||
result := make([]HailParam, len(lines))
|
||||
|
||||
for i, line := range lines {
|
||||
result[i] = ReadHailLine(line)
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func AtoIOrPanic(str string) (num int) {
|
||||
num, err := strconv.Atoi(str)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
return
|
||||
}
|
||||
87
day24/lines_test.go
Normal file
87
day24/lines_test.go
Normal file
@@ -0,0 +1,87 @@
|
||||
package day24
|
||||
|
||||
import (
|
||||
"strings"
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestReadLine(t *testing.T) {
|
||||
lines := `19, 13, 30 @ -2, 1, -2
|
||||
18, 19, 22 @ -1, -1, -2
|
||||
20, 25, 34 @ -2, -2, -4
|
||||
12, 31, 28 @ -1, -2, -1
|
||||
20, 19, 15 @ 1, -5, -3`
|
||||
|
||||
for _, line := range strings.Split(lines, "\n") {
|
||||
hail := ReadHailLine(line)
|
||||
t.Log(hail)
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestReadLineInput(t *testing.T) {
|
||||
lines := `147847636573416, 190826994408605, 140130741291716 @ 185, 49, 219
|
||||
287509258905812, 207449079739538, 280539021150559 @ -26, 31, 8
|
||||
390970075767404, 535711685410735, 404166182422876 @ -147, -453, -149
|
||||
306391780523937, 382508967958270, 264612201472049 @ -24, -274, 28
|
||||
278063616684570, 510959526404728, 288141792965603 @ -18, -441, -6`
|
||||
for _, line := range strings.Split(lines, "\n") {
|
||||
hail := ReadHailLine(line)
|
||||
t.Logf("%+v\n", hail)
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestSecondPointIsInFuture(t *testing.T) {
|
||||
lines := `19, 13, 30 @ -2, 1, -2
|
||||
18, 19, 22 @ -1, -1, -2
|
||||
20, 25, 34 @ -2, -2, -4
|
||||
12, 31, 28 @ -1, -2, -1
|
||||
20, 19, 15 @ 1, -5, -3`
|
||||
|
||||
for _, line := range strings.Split(lines, "\n") {
|
||||
hail := ReadHailLine(line)
|
||||
t.Log(hail)
|
||||
t.Logf("calced seconds point %+v is in future %t\n", hail.p1, hail.PointInFuture(hail.p1))
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestIntersectExampleOne(t *testing.T) {
|
||||
// Hailstone A: 19, 13, 30 @ -2, 1, -2
|
||||
// Hailstone B: 18, 19, 22 @ -1, -1, -2
|
||||
// Hailstones' paths will cross inside the test area (at x=14.333, y=15.333).
|
||||
|
||||
hA := ReadHailLine("19, 13, 30 @ -2, 1, -2")
|
||||
hB := ReadHailLine("18, 19, 22 @ -1, -1, -2")
|
||||
|
||||
x, y, check := IntersectBySlopeAndShift(hA, hB)
|
||||
if !check {
|
||||
panic("should intersect")
|
||||
}
|
||||
t.Logf("got intersection at %f %f", x, y)
|
||||
}
|
||||
|
||||
func TestIntersectExampleTwo(t *testing.T) {
|
||||
// Hailstone A: 18, 19, 22 @ -1, -1, -2
|
||||
// Hailstone B: 20, 25, 34 @ -2, -2, -4
|
||||
hA := ReadHailLine("18, 19, 22 @ -1, -1, -2")
|
||||
hB := ReadHailLine("20, 25, 34 @ -2, -2, -4")
|
||||
|
||||
x, y, check := IntersectBySlopeAndShift(hA, hB)
|
||||
if check {
|
||||
panic("should not intersect")
|
||||
}
|
||||
t.Logf("got intersection at %f %f", x, y)
|
||||
}
|
||||
|
||||
func TestExamplePairwiseChecks(t *testing.T) {
|
||||
filename := "example"
|
||||
hails := ReadHailFile(filename)
|
||||
for _, hail := range hails {
|
||||
t.Log(hail)
|
||||
}
|
||||
|
||||
intersections := CheckPairwiseIntersections(hails)
|
||||
t.Log("counted intersections ", intersections)
|
||||
}
|
||||
12
day24/neverTellMeTheOdds.go
Normal file
12
day24/neverTellMeTheOdds.go
Normal file
@@ -0,0 +1,12 @@
|
||||
package day24
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Println("hello day 24, i'm getting tired")
|
||||
filename := "day24/input"
|
||||
hails := ReadHailFile(filename)
|
||||
return CheckPairwiseIntersections(hails)
|
||||
}
|
||||
118
day24/notes.org
Normal file
118
day24/notes.org
Normal file
@@ -0,0 +1,118 @@
|
||||
#+title: Notes
|
||||
* i want help from math
|
||||
https://math.stackexchange.com/questions/28503/how-to-find-intersection-of-two-lines-in-3d
|
||||
|
||||
'vector parametric form' is exactly what we're getting in the input?
|
||||
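concretely, every input line is already that parametric form: the example line 19, 13, 30 @ -2, 1, -2 is the line P(t) = (19, 13, 30) + t * (-2, 1, -2).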
* huh and only 'looking forward in time' so solutions with negative t are not important.
|
||||
cooool
|
||||
** i see that speeds are integers, so updated points are integers.
|
||||
maybe i do calculation of every line on every time point?
|
||||
|
||||
and if i do that, is there a way to get intersections efficiently?
|
||||
** i'll also need the ends for lines? ends to the line segments.
|
||||
with limits on the x & y by the task
|
||||
|
||||
for example both 7 <= <= 27
|
||||
for input 200000000000000 <= <= 400000000000000
|
||||
|
||||
also. can't we move the coords? maybe not? maybe only for one
|
||||
so, what do i do? to get the ends of the lines?
|
||||
i try to calculate with both x & y at the 2 min/max values. then if the other one is ok, then those are the ends?
|
||||
wait, what happens when i do x = 7 and x = 27 and y is outside? it means no intersections, i guess
|
||||
or it could be outside from different sides, so not all x are ok, but there's still a line there
|
||||
** Using homogeneous coordinates
|
||||
https://en.wikipedia.org/wiki/Line%E2%80%93line_intersection
|
||||
no, i don't understand that
|
||||
** https://en.wikipedia.org/wiki/Line%E2%80%93line_intersection#Given_two_points_on_each_line
|
||||
with 2 points. i guess
|
||||
but also - check if the point in future of the hail, by comparing with speeds?
|
||||
should be easy
|
||||
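a worked sketch of the two-points / slope-and-shift version on the first two example hailstones (x and y only). per the task text their paths cross at about (14.333, 15.333), and the crossing is in the future for both, so the sign check against the speeds is enough:

#+begin_src go
package main

import "fmt"

func main() {
	// hailstone A: 19, 13 @ -2, 1    hailstone B: 18, 19 @ -1, -1 (x and y only)
	slopeA, shiftA := 1.0/-2.0, 13.0-(1.0/-2.0)*19.0 // -0.5, 22.5
	slopeB, shiftB := -1.0/-1.0, 19.0-1.0*18.0       // 1, 1

	x := (shiftB - shiftA) / (slopeA - slopeB) // ~14.333
	y := slopeA*x + shiftA                     // ~15.333

	// "in the future" check: x has to move away from x0 in the direction of Dx
	futureA := (x-19.0)*(-2.0) >= 0 // true
	futureB := (x-18.0)*(-1.0) >= 0 // true
	fmt.Println(x, y, futureA, futureB)
}
#+end_src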
** and i got wrong result
|
||||
day24 result: 8406
|
||||
** another formula gives
|
||||
day24 result: 8406
|
||||
** another formula
|
||||
12938
|
||||
*
|
||||
* ok, part 2.
|
||||
what if.
|
||||
i start checking t = 0, 1, etc.
|
||||
for each t, i need two points of the two hail lines.
|
||||
|
||||
it would constitute the trajectory.
|
||||
then condition for the solution that all other hail lines will intersect it at some t.
|
||||
so check for intersection (maybe not necessarily in the field?)
|
||||
|
||||
go though lines, if any fail to intersect - continue with t
|
||||
|
||||
if all intersect, find where the rock has to be in time 0
|
||||
|
||||
oh. no.
|
||||
it's not just that they intersect. it's that the rock's movement over t has to be there at the correct time? yuck?
|
||||
|
||||
would there really be more than one line that intersects all of the hail lines?
|
||||
|
||||
i'll just need to also figure out t=0 from other coords.
|
||||
|
||||
i don't like this at all.
|
||||
|
||||
And intersections have to be over (X, Y, Z)
|
||||
** so 'hail mary' approach would be
|
||||
scan first 1k nanoseconds. so already 1M calculations
|
||||
( this is first part of desperation, that at least 2 hails will intercept in first 1k ticks )
|
||||
|
||||
for collision 1, assume HailA is on path.
|
||||
then iterate over all the others, assuming they are intercepted at t2 etc.?
|
||||
|
||||
no. the intersections could be on non-integer times?
|
||||
( this would be second part of the 'hail mary' )
|
||||
|
||||
from that i should be able to construct the 'trajectory' line.
|
||||
and then check with all the others - do they intersect?
|
||||
( and check of intersection in future would be nice )
|
||||
|
||||
then if line confirmed, will need to calc for t = 0, t = 1, and get speeds
|
||||
*** not hoping for all integer intersections
|
||||
or what if i will hope for that?
|
||||
let's try?
|
||||
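the algebra for the 'assume hit times' step (this is what AssumeHails / AssumedDelta do): if the stone starts at c with speed Dc and meets hail 1 at t1 and hail 2 at t2, then c + t1*Dc = c1 + t1*Dc1 and c + t2*Dc = c2 + t2*Dc2. subtracting, Dc = (c1 - c2 + t1*Dc1 - t2*Dc2) / (t1 - t2), and then c = c1 + t1*Dc1 - t1*Dc. a standalone check of that per-coordinate algebra on the first two example hailstones, assuming hit times t1 = 5 and t2 = 3 (for the x coordinate this gives back Dx = -3 and x = 24):

#+begin_src go
package main

import "fmt"

// same per-coordinate algebra as AssumedDelta / AssumedStartFromDelta
func assumedDelta(c1, c2, dc1, dc2, t1, t2 int) (delta int, isInt bool) {
	dividend := c1 - c2 + t1*dc1 - t2*dc2
	return dividend / (t1 - t2), dividend%(t1-t2) == 0
}

func assumedStart(c1, dc1, t1, dc int) int {
	return c1 + t1*dc1 - t1*dc
}

func main() {
	// hail 1: 19, 13, 30 @ -2, 1, -2    hail 2: 18, 19, 22 @ -1, -1, -2
	dx, ok := assumedDelta(19, 18, -2, -1, 5, 3) // (1 - 10 + 3) / 2 = -3
	x := assumedStart(19, -2, 5, dx)             // 19 - 10 + 15 = 24
	fmt.Println(dx, ok, x)                       // -3 true 24
}
#+end_src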
* ok, what if i could do a system of equations?
|
||||
#+begin_src
|
||||
yuck_test.go:12:
|
||||
x + Dx * t0 == 19 + -2 * t0
|
||||
y + Dy * t0 == 13 + 1 * t0
|
||||
z + Dz * t0 == 30 + -2 * t0
|
||||
x + Dx * t1 == 18 + -1 * t1
|
||||
y + Dy * t1 == 19 + -1 * t1
|
||||
z + Dz * t1 == 22 + -2 * t1
|
||||
x + Dx * t2 == 20 + -2 * t2
|
||||
y + Dy * t2 == 25 + -2 * t2
|
||||
z + Dz * t2 == 34 + -4 * t2
|
||||
solve for x, y, z, Dx, Dy, Dz, t0, t1, t2. ti > 0
|
||||
#+end_src
|
||||
|
||||
#+begin_src
|
||||
yuck_test.go:18:
|
||||
x + Dx * t0 == 147847636573416 + 185 * t0
|
||||
y + Dy * t0 == 190826994408605 + 49 * t0
|
||||
z + Dz * t0 == 140130741291716 + 219 * t0
|
||||
x + Dx * t1 == 287509258905812 + -26 * t1
|
||||
y + Dy * t1 == 207449079739538 + 31 * t1
|
||||
z + Dz * t1 == 280539021150559 + 8 * t1
|
||||
x + Dx * t2 == 390970075767404 + -147 * t2
|
||||
y + Dy * t2 == 535711685410735 + -453 * t2
|
||||
z + Dz * t2 == 404166182422876 + -149 * t2
|
||||
solve for x, y, z, Dx, Dy, Dz, t0, t1, t2. ti > 0
|
||||
#+end_src
|
||||
* got some solution
|
||||
https://z3prover.github.io/papers/programmingz3.html#sec-intro
|
||||
|
||||
enefedov@LLF33A87M:~/Documents/personal/advent-of-code-2023$ python day24/pythonZ3/forInput.py
|
||||
Solution: [t0 = 666003776903,
|
||||
t2 = 779453185471,
|
||||
t1 = 654152070134,
|
||||
Dz = 18,
|
||||
Dx = 47,
|
||||
Dy = -360,
|
||||
z = 273997500449219,
|
||||
y = 463222539161932,
|
||||
x = 239756157786030]
|
||||
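quick sanity check of that solution against the first input hailstone, x coordinate only: 239756157786030 + 47 * 666003776903 = 271058335300471, and 147847636573416 + 185 * 666003776903 = 271058335300471, so the stone and that hailstone really are at the same x at t0. the puzzle answer is then x + y + z, which forInput.py prints at the end.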
47
day24/pythonZ3/example.py
Normal file
47
day24/pythonZ3/example.py
Normal file
@@ -0,0 +1,47 @@
|
||||
#!/usr/bin/env python3
|
||||
|
||||
from z3 import *
|
||||
|
||||
s = Solver()
|
||||
|
||||
x = Real('x')
|
||||
Dx = Real('Dx')
|
||||
y = Real('y')
|
||||
Dy = Real('Dy')
|
||||
z = Real('z')
|
||||
Dz = Real('Dz')
|
||||
t0 = Real('t0')
|
||||
eqT0 = t0 >= 0
|
||||
eq0x = x + Dx * t0 == (-2 * t0) + 19
|
||||
eq0y = y + Dy * t0 == (1 * t0) + 13
|
||||
eq0z = z + Dz * t0 == (-2 * t0) + 30
|
||||
t1 = Real('t1')
|
||||
eqT1 = t1 >= 0
|
||||
eq1x = x + Dx * t1 == (-1 * t1) + 18
|
||||
eq1y = y + Dy * t1 == (-1 * t1) + 19
|
||||
eq1z = z + Dz * t1 == (-2 * t1) + 22
|
||||
t2 = Real('t2')
|
||||
eqT2 = t2 >= 0
|
||||
eq2x = x + Dx * t2 == (-2 * t2) + 20
|
||||
eq2y = y + Dy * t2 == (-2 * t2) + 25
|
||||
eq2z = z + Dz * t2 == (-4 * t2) + 34
|
||||
#solve for x, y, z, Dx, Dy, Dz, t1, t2, t3.
|
||||
|
||||
|
||||
s.add(eqT0,
|
||||
eq0x,
|
||||
eq0y,
|
||||
eq0z,
|
||||
eqT1,
|
||||
eq1x,
|
||||
eq1y,
|
||||
eq1z,
|
||||
eqT2,
|
||||
eq2x,
|
||||
eq2y,
|
||||
eq2z)
|
||||
|
||||
if s.check() == sat:
|
||||
print("Solution:", s.model())
|
||||
else:
|
||||
print("No solution found")
|
||||
49
day24/pythonZ3/forInput.py
Normal file
49
day24/pythonZ3/forInput.py
Normal file
@@ -0,0 +1,49 @@
|
||||
#!/usr/bin/env python3
|
||||
|
||||
from z3 import *
|
||||
|
||||
s = Solver()
|
||||
|
||||
x = Real('x')
|
||||
Dx = Real('Dx')
|
||||
y = Real('y')
|
||||
Dy = Real('Dy')
|
||||
z = Real('z')
|
||||
Dz = Real('Dz')
|
||||
t0 = Real('t0')
|
||||
eqT0 = t0 >= 0
|
||||
eq0x = x + Dx * t0 == (185 * t0) + 147847636573416
|
||||
eq0y = y + Dy * t0 == (49 * t0) + 190826994408605
|
||||
eq0z = z + Dz * t0 == (219 * t0) + 140130741291716
|
||||
t1 = Real('t1')
|
||||
eqT1 = t1 >= 0
|
||||
eq1x = x + Dx * t1 == (-26 * t1) + 287509258905812
|
||||
eq1y = y + Dy * t1 == (31 * t1) + 207449079739538
|
||||
eq1z = z + Dz * t1 == (8 * t1) + 280539021150559
|
||||
t2 = Real('t2')
|
||||
eqT2 = t2 >= 0
|
||||
eq2x = x + Dx * t2 == (-147 * t2) + 390970075767404
|
||||
eq2y = y + Dy * t2 == (-453 * t2) + 535711685410735
|
||||
eq2z = z + Dz * t2 == (-149 * t2) + 404166182422876
|
||||
#solve for x, y, z, Dx, Dy, Dz, t1, t2, t3.
|
||||
|
||||
|
||||
s.add(eqT0,
|
||||
eq0x,
|
||||
eq0y,
|
||||
eq0z,
|
||||
eqT1,
|
||||
eq1x,
|
||||
eq1y,
|
||||
eq1z,
|
||||
eqT2,
|
||||
eq2x,
|
||||
eq2y,
|
||||
eq2z)
|
||||
|
||||
if s.check() == sat:
|
||||
print("Solution:", s.model())
|
||||
else:
|
||||
print("No solution found")
|
||||
|
||||
print(273997500449219 + 463222539161932 + 239756157786030)
|
||||
17
day24/pythonZ3/practice.py
Normal file
17
day24/pythonZ3/practice.py
Normal file
@@ -0,0 +1,17 @@
|
||||
#!/usr/bin/env python3
|
||||
|
||||
from z3 import *
|
||||
|
||||
x = Real('x')
|
||||
y = Real('y')
|
||||
|
||||
eq1 = x + y == 5
|
||||
eq2 = x - y == 3
|
||||
|
||||
s = Solver()
|
||||
s.add(eq1, eq2)
|
||||
|
||||
if s.check() == sat:
|
||||
print("Solution:", s.model())
|
||||
else:
|
||||
print("No solution found")
|
||||
78
day24/yuck.go
Normal file
78
day24/yuck.go
Normal file
@@ -0,0 +1,78 @@
|
||||
package day24
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
)
|
||||
|
||||
func SystemsWithSymbols() (result string) {
|
||||
result += "\n"
|
||||
coords := []string{"x", "y", "z"}
|
||||
for i := 0; i < 3; i++ {
|
||||
for _, coord := range coords {
|
||||
result += fmt.Sprintf("%s + D%s * t%d == %s%d + D%s%d * t%d\n",
|
||||
coord, coord, i, coord, i, coord, i, i)
|
||||
}
|
||||
}
|
||||
|
||||
result += "solve for x, y, z, Dx, Dy, Dz, t1, t2, t3. ti > 0"
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func SystemFromThreeHailstones(hails []HailParam) (result string) {
|
||||
result += "\n"
|
||||
coords := []string{"x", "y", "z"}
|
||||
for i := 0; i < 3; i++ {
|
||||
result += fmt.Sprintf("t%d >= 0\n", i)
|
||||
hailIter := hails[i]
|
||||
for _, coord := range coords {
|
||||
result += fmt.Sprintf("%s + D%s * t%d == %d + %d * t%d\n",
|
||||
coord, coord, i,
|
||||
hailIter.GetCoord(coord), hailIter.GetSpeedOf(coord), i)
|
||||
}
|
||||
}
|
||||
|
||||
result += "solve for x, y, z, Dx, Dy, Dz, t1, t2, t3."
|
||||
return
|
||||
}
|
||||
|
||||
func SystemFromThreeHailstonesToTheLeft(hails []HailParam) (result string) {
|
||||
result += "\n"
|
||||
coords := []string{"x", "y", "z"}
|
||||
for i := 0; i < 3; i++ {
|
||||
result += fmt.Sprintf("t%d >= 0\n", i)
|
||||
hailIter := hails[i]
|
||||
for _, coord := range coords {
|
||||
result += fmt.Sprintf("%s + D%s * t%d - (%d * t%d) == %d \n",
|
||||
coord, coord, i,
|
||||
hailIter.GetSpeedOf(coord), i, hailIter.GetCoord(coord))
|
||||
}
|
||||
}
|
||||
|
||||
result += "solve for x, y, z, Dx, Dy, Dz, t1, t2, t3."
|
||||
return
|
||||
}
|
||||
|
||||
func SystemAsPythonInit(hails []HailParam) (result string) {
|
||||
result += "\n"
|
||||
coords := []string{"x", "y", "z"}
|
||||
for _, coord := range coords {
|
||||
result += fmt.Sprintf("%s = Real('%s')\n", coord, coord)
|
||||
result += fmt.Sprintf("D%s = Real('D%s')\n", coord, coord)
|
||||
}
|
||||
|
||||
for i := 0; i < 3; i++ {
|
||||
result += fmt.Sprintf("t%d = Real('t%d')\n", i, i)
|
||||
result += fmt.Sprintf("eqT%d = t%d >= 0\n", i, i)
|
||||
hailIter := hails[i]
|
||||
for _, coord := range coords {
|
||||
result += fmt.Sprintf("eq%d%s = %s + D%s * t%d == (%d * t%d) + %d \n",
|
||||
i, coord,
|
||||
coord, coord, i,
|
||||
hailIter.GetSpeedOf(coord), i, hailIter.GetCoord(coord))
|
||||
}
|
||||
}
|
||||
|
||||
result += "//solve for x, y, z, Dx, Dy, Dz, t1, t2, t3."
|
||||
return
|
||||
}
|
||||
19
day24/yuck_test.go
Normal file
19
day24/yuck_test.go
Normal file
@@ -0,0 +1,19 @@
|
||||
package day24
|
||||
|
||||
import "testing"
|
||||
|
||||
func TestPrintJustSymbol(t *testing.T) {
|
||||
t.Log(SystemsWithSymbols())
|
||||
}
|
||||
|
||||
func TestPrintSystemExample(t *testing.T) {
|
||||
filename := "example"
|
||||
hails := ReadHailFile(filename)
|
||||
t.Log(SystemAsPythonInit(hails))
|
||||
}
|
||||
|
||||
func TestPrintSystemInput(t *testing.T) {
|
||||
filename := "input"
|
||||
hails := ReadHailFile(filename)
|
||||
t.Log(SystemAsPythonInit(hails))
|
||||
}
|
||||
10
day25/Snowerload.go
Normal file
10
day25/Snowerload.go
Normal file
@@ -0,0 +1,10 @@
|
||||
package day25
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
)
|
||||
|
||||
func Run() int {
|
||||
fmt.Println("time to wrap things up")
|
||||
return 0
|
||||
}
|
||||
1440
day25/after-removing-cycles.mmd
Normal file
1440
day25/after-removing-cycles.mmd
Normal file
File diff suppressed because it is too large
Load Diff
13
day25/example
Normal file
13
day25/example
Normal file
@@ -0,0 +1,13 @@
|
||||
jqt: rhn xhk nvd
|
||||
rsh: frs pzl lsr
|
||||
xhk: hfx
|
||||
cmg: qnr nvd lhk bvb
|
||||
rhn: xhk bvb hfx
|
||||
bvb: xhk hfx
|
||||
pzl: lsr hfx nvd
|
||||
qnr: nvd
|
||||
ntq: jqt hfx bvb xhk
|
||||
nvd: lhk
|
||||
lsr: lhk
|
||||
rzs: qnr cmg lsr rsh
|
||||
frs: qnr lhk lsr
|
||||
15
day25/example-after-removing.mmd
Normal file
15
day25/example-after-removing.mmd
Normal file
@@ -0,0 +1,15 @@
|
||||
flowchart TD
|
||||
cmg --- qnr
|
||||
cmg --- lhk
|
||||
jqt --- nvd
|
||||
bvb --- rhn
|
||||
lsr --- pzl
|
||||
lhk --- lsr
|
||||
lsr --- rsh
|
||||
hfx --- ntq
|
||||
qnr --- rzs
|
||||
bvb --- hfx
|
||||
lhk --- nvd
|
||||
frs --- qnr
|
||||
jqt --- ntq
|
||||
rhn --- xhk
|
||||
34
day25/example-before-removing.mmd
Normal file
34
day25/example-before-removing.mmd
Normal file
@@ -0,0 +1,34 @@
|
||||
flowchart TD
|
||||
ntq --- xhk
|
||||
bvb --- hfx
|
||||
lsr --- pzl
|
||||
nvd --- pzl
|
||||
pzl --- rsh
|
||||
frs --- lhk
|
||||
bvb --- cmg
|
||||
jqt --- xhk
|
||||
bvb --- rhn
|
||||
jqt --- rhn
|
||||
hfx --- xhk
|
||||
frs --- lsr
|
||||
lhk --- lsr
|
||||
jqt --- nvd
|
||||
cmg --- rzs
|
||||
hfx --- pzl
|
||||
bvb --- xhk
|
||||
rhn --- xhk
|
||||
frs --- rsh
|
||||
cmg --- qnr
|
||||
nvd --- qnr
|
||||
qnr --- rzs
|
||||
bvb --- ntq
|
||||
frs --- qnr
|
||||
cmg --- nvd
|
||||
hfx --- rhn
|
||||
jqt --- ntq
|
||||
hfx --- ntq
|
||||
lsr --- rsh
|
||||
cmg --- lhk
|
||||
rsh --- rzs
|
||||
lhk --- nvd
|
||||
lsr --- rzs
|
||||
34
day25/example-graph.mmd
Normal file
34
day25/example-graph.mmd
Normal file
@@ -0,0 +1,34 @@
|
||||
flowchart TD
|
||||
hfx --- pzl
|
||||
cmg --- lhk
|
||||
lsr --- rsh
|
||||
rsh --- rzs
|
||||
bvb --- rhn
|
||||
jqt --- xhk
|
||||
nvd --- pzl
|
||||
lsr --- pzl
|
||||
frs --- qnr
|
||||
frs --- lsr
|
||||
lhk --- lsr
|
||||
lsr --- rzs
|
||||
rhn --- xhk
|
||||
hfx --- ntq
|
||||
nvd --- qnr
|
||||
qnr --- rzs
|
||||
bvb --- xhk
|
||||
hfx --- xhk
|
||||
jqt --- rhn
|
||||
jqt --- nvd
|
||||
cmg --- nvd
|
||||
lhk --- nvd
|
||||
frs --- rsh
|
||||
ntq --- xhk
|
||||
cmg --- rzs
|
||||
bvb --- ntq
|
||||
cmg --- qnr
|
||||
hfx --- rhn
|
||||
jqt --- ntq
|
||||
bvb --- cmg
|
||||
frs --- lhk
|
||||
pzl --- rsh
|
||||
bvb --- hfx
|
||||
1
day25/example-graph.mmd.svg
Normal file
1
day25/example-graph.mmd.svg
Normal file
File diff suppressed because one or more lines are too long
|
After Width: | Height: | Size: 37 KiB |
3
day25/example2
Normal file
3
day25/example2
Normal file
@@ -0,0 +1,3 @@
|
||||
jqt: rhn nvd rsh
|
||||
rsh: frs pzl lsr
|
||||
xhk: hfx
|
||||
301
day25/graph.go
Normal file
301
day25/graph.go
Normal file
@@ -0,0 +1,301 @@
|
||||
package day25
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"strings"
|
||||
|
||||
mapset "github.com/deckarep/golang-set/v2"
|
||||
)
|
||||
|
||||
type Graph struct {
|
||||
Nodes map[string]*Node
|
||||
}
|
||||
|
||||
type Node struct {
|
||||
Name string
|
||||
Neighbors mapset.Set[string]
|
||||
}
|
||||
|
||||
func (n Node) String() string {
|
||||
return fmt.Sprintf("[%s : %+v]", n.Name, n.Neighbors)
|
||||
}
|
||||
|
||||
func ReadGraphFile(filename string) (g Graph) {
|
||||
g.Nodes = map[string]*Node{}
|
||||
|
||||
bytes, err := os.ReadFile(filename)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
text := strings.TrimSpace(string(bytes))
|
||||
for _, line := range strings.Split(text, "\n") {
|
||||
g.readGraphLine(line)
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (g *Graph) readGraphLine(l string) {
|
||||
firstSplit := strings.Split(l, ":")
|
||||
|
||||
node, exists := g.Nodes[firstSplit[0]]
|
||||
if !exists {
|
||||
node = &Node{
|
||||
Name: firstSplit[0],
|
||||
Neighbors: mapset.NewSet[string](),
|
||||
}
|
||||
}
|
||||
|
||||
secondSplit := strings.Fields(firstSplit[1])
|
||||
|
||||
for _, neighborName := range secondSplit {
|
||||
neighbor, exists := g.Nodes[neighborName]
|
||||
if !exists {
|
||||
neighbor = &Node{
|
||||
Name: neighborName,
|
||||
Neighbors: mapset.NewSet[string](),
|
||||
}
|
||||
g.Nodes[neighborName] = neighbor
|
||||
}
|
||||
neighbor.Neighbors.Add(node.Name)
|
||||
node.Neighbors.Add(neighbor.Name)
|
||||
}
|
||||
|
||||
g.Nodes[node.Name] = node
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
// NOTE this is so sad. nodeA.Neighbors.Remove(nodeB.Name) hangs for a reason i don't understand
|
||||
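// my guess, not verified: the default mapset set is the thread-safe one, Iter() keeps holding
// its read lock while we consume the channel, and Remove wants the write lock - so calling
// Remove from inside a range over Iter() (as TryToSplit does) can deadlock; rebuilding the
// neighbor set via Difference avoids mutating the set that is still being iterated.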
func (g *Graph) RemoveEdge(a, b string) {
|
||||
// log.Printf("entering remove edge for %s and %s", a, b)
|
||||
nodeA, existsA := g.Nodes[a]
|
||||
// log.Println("got first node", nodeA, existsA)
|
||||
nodeB, existsB := g.Nodes[b]
|
||||
// log.Println("got second node", nodeB, existsB)
|
||||
if !existsA || !existsB {
|
||||
panic("requesting not found node")
|
||||
}
|
||||
|
||||
// log.Println("before removals")
|
||||
// log.Println("before remove first", nodeA)
|
||||
// nodeA.Neighbors = newANeighbors
|
||||
nodeA.Neighbors = nodeA.Neighbors.Difference(mapset.NewSet[string](nodeB.Name))
|
||||
// nodeA.Neighbors.Remove(nodeB.Name)
|
||||
// log.Println("removed first", nodeA)
|
||||
|
||||
// log.Println("before remove second", nodeB)
|
||||
// nodeB.Neighbors = newBNeighbors
|
||||
nodeB.Neighbors = nodeB.Neighbors.Difference(mapset.NewSet[string](nodeA.Name))
|
||||
// nodeB.Neighbors.Remove(nodeA.Name)
|
||||
// log.Println("removed second", nodeB)
|
||||
}
|
||||
|
||||
func (g *Graph) AddEdge(a, b string) {
|
||||
nodeA, existsA := g.Nodes[a]
|
||||
nodeB, existsB := g.Nodes[b]
|
||||
if !existsA || !existsB {
|
||||
panic("requesting not found node")
|
||||
}
|
||||
nodeA.Neighbors.Add(nodeB.Name)
|
||||
nodeB.Neighbors.Add(nodeA.Name)
|
||||
}
|
||||
|
||||
func (g *Graph) findCycle() (from, to string, exists bool) {
|
||||
// log.Printf(">>>> starting new find cycle")
|
||||
var firstNode *Node
|
||||
for _, n := range g.Nodes {
|
||||
firstNode = n
|
||||
break
|
||||
}
|
||||
// log.Printf("initial search from %s and neighbors %+v", firstNode.Name, firstNode.Neighbors)
|
||||
for neighborName := range firstNode.Neighbors.Iter() {
|
||||
initialVisited := mapset.NewSet[string](firstNode.Name)
|
||||
// log.Printf("initial dfs from %s to %s with initial visited %+v", firstNode.Name, neighborName, initialVisited)
|
||||
from, to, exists = g.dfcCycle(firstNode.Name, neighborName, initialVisited)
|
||||
if exists {
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
// log.Printf("<<<< cycle %t, from %s to %s", exists, from, to)
|
||||
return
|
||||
}
|
||||
|
||||
func (g *Graph) dfcCycle(fromName, atName string, visited mapset.Set[string]) (cycleFrom, cycleTo string, cycleExists bool) {
|
||||
// log.Printf("> step from %+v to %+v. visited : %+v", fromName, atName, visited)
|
||||
if visited.Cardinality() == len(g.Nodes) {
|
||||
log.Println("exit by visited all")
|
||||
return
|
||||
}
|
||||
|
||||
atNode := g.Nodes[atName]
|
||||
|
||||
if visited.Contains(atName) {
|
||||
return fromName, atName, true
|
||||
}
|
||||
|
||||
for neighborName := range atNode.Neighbors.Iter() {
|
||||
if neighborName == fromName {
|
||||
continue
|
||||
}
|
||||
newVisited := visited.Clone()
|
||||
newVisited.Add(atName)
|
||||
cycleFrom, cycleTo, cycleExists = g.dfcCycle(atName, neighborName, newVisited)
|
||||
if cycleExists {
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (g *Graph) ComponentFrom(fromName string) (component mapset.Set[string]) {
|
||||
startNode := g.Nodes[fromName]
|
||||
component = mapset.NewSet[string](startNode.Name)
|
||||
toVisit := startNode.Neighbors.Clone()
|
||||
|
||||
for toVisit.Cardinality() > 0 {
|
||||
runnerNodeName, _ := toVisit.Pop()
|
||||
if component.Contains(runnerNodeName) {
|
||||
continue
|
||||
}
|
||||
component.Add(runnerNodeName)
|
||||
runnerNode := g.Nodes[runnerNodeName]
|
||||
unvisitedNeighbors := runnerNode.Neighbors.Difference(component)
|
||||
// log.Printf("adding %s to component. neighbors %+v, adding %+v to visit",
|
||||
// runnerNodeName, runnerNode.Neighbors, unvisitedNeighbors)
|
||||
toVisit = toVisit.Union(unvisitedNeighbors)
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func (g *Graph) ToMermaid() (result string) {
|
||||
result += "flowchart TD\n"
|
||||
edges := mapset.NewSet[string]()
|
||||
|
||||
for _, node := range g.Nodes {
|
||||
for neighborName := range node.Neighbors.Iter() {
|
||||
var first, second string
|
||||
if node.Name < neighborName {
|
||||
first = node.Name
|
||||
second = neighborName
|
||||
} else {
|
||||
first = neighborName
|
||||
second = node.Name
|
||||
}
|
||||
edges.Add(fmt.Sprintf("\t%s --- %s\n", first, second))
|
||||
}
|
||||
}
|
||||
|
||||
for line := range edges.Iter() {
|
||||
result += line
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (g *Graph) SaveAsMermaid(filename string) {
|
||||
mmd := g.ToMermaid()
|
||||
|
||||
file, err := os.Create(filename)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
defer func() {
|
||||
if err := file.Close(); err != nil {
|
||||
panic(err)
|
||||
}
|
||||
}()
|
||||
|
||||
file.WriteString(mmd)
|
||||
}
|
||||
|
||||
type Edge struct {
|
||||
smaller, bigger string
|
||||
}
|
||||
func (e Edge) String() string {
|
||||
return fmt.Sprintf("%s/%s", e.smaller, e.bigger)
|
||||
}
|
||||
|
||||
func CreateEdge(a, b string) Edge {
|
||||
var smaller, bigger string
|
||||
if a < b {
|
||||
smaller = a
|
||||
bigger = b
|
||||
} else {
|
||||
smaller = b
|
||||
bigger = a
|
||||
}
|
||||
return Edge{smaller, bigger}
|
||||
}
|
||||
|
||||
func (g *Graph) RemoveAllCycles() (removedEdges mapset.Set[Edge]) {
|
||||
removedEdges = mapset.NewSet[Edge]()
|
||||
hasCycle := true
|
||||
var from, to string
|
||||
for hasCycle {
|
||||
from, to, hasCycle = g.findCycle()
|
||||
if hasCycle {
|
||||
// log.Printf("\n!!!! found cycle %s to %s\n", from, to)
|
||||
edgeToRemove := CreateEdge(from, to)
|
||||
removedEdges.Add(edgeToRemove)
|
||||
g.RemoveEdge(from, to)
|
||||
}
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (g *Graph) TryToSplit() (componentSizeMult int) {
|
||||
// first remove all cycles
|
||||
removedEdges := g.RemoveAllCycles()
|
||||
g.SaveAsMermaid("after-removing-cycles.mmd")
|
||||
// log.Printf("all removed edges %+v, two of them are necessary to split initial graph into 2 ", removedEdges)
|
||||
|
||||
triedEdges := mapset.NewSet[Edge]()
|
||||
|
||||
for _, node := range g.Nodes {
|
||||
for neighborName := range node.Neighbors.Iter() {
|
||||
edge := CreateEdge(neighborName, node.Name)
|
||||
if triedEdges.Contains(edge) {
|
||||
continue
|
||||
}
|
||||
triedEdges.Add(edge)
|
||||
// first remove the edge
|
||||
g.RemoveEdge(edge.bigger, edge.smaller)
|
||||
|
||||
// then ask for components of the nodes of removed edge
|
||||
compA := g.ComponentFrom(edge.bigger)
|
||||
compB := g.ComponentFrom(edge.smaller)
|
||||
|
||||
// iterate over the initially removed edges. only two of them should be 'connecting'
|
||||
// i.e were necessary to remove
|
||||
necessaryEdgesCount := 0
|
||||
for initiallyRemovedEdge := range removedEdges.Iter() {
|
||||
endA, endB := initiallyRemovedEdge.bigger, initiallyRemovedEdge.smaller
|
||||
isNonNecessary := (compA.Contains(endA) && compA.Contains(endB)) || (compB.Contains(endA) && compB.Contains(endB))
|
||||
if !isNonNecessary {
|
||||
// log.Printf("with edge %+v test removed, the %+v also seems necessary", edge, initiallyRemovedEdge)
|
||||
necessaryEdgesCount += 1
|
||||
}
|
||||
}
|
||||
// log.Printf("with edge %+v test removed neessary count is %d", edge, necessaryEdgesCount)
|
||||
|
||||
// if we found 2 necessary, then our currently tried edge is the third one necessary to remove
|
||||
// and our two components are the ones we were searching for
|
||||
if necessaryEdgesCount == 2 {
|
||||
return compA.Cardinality() * compB.Cardinality()
|
||||
}
|
||||
|
||||
// in the end add edge back if not fitting
|
||||
g.AddEdge(edge.bigger, edge.smaller)
|
||||
}
|
||||
}
|
||||
|
||||
// now huh. if we didn't find `necessaryEdgesCount == 2`
|
||||
// that means 0, 1 or 3
|
||||
|
||||
return
|
||||
}
|
||||
96
day25/graph_test.go
Normal file
96
day25/graph_test.go
Normal file
@@ -0,0 +1,96 @@
|
||||
package day25
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
mapset "github.com/deckarep/golang-set/v2"
|
||||
)
|
||||
|
||||
func TestReadFileExample(t *testing.T) {
|
||||
filename := "example"
|
||||
g := ReadGraphFile(filename)
|
||||
t.Logf("read graph %+v", g)
|
||||
}
|
||||
|
||||
func TestRemoveEdge(t *testing.T) {
|
||||
filename := "example"
|
||||
g := ReadGraphFile(filename)
|
||||
t.Logf("read graph %+v", g)
|
||||
|
||||
g.RemoveEdge("bvb", "hfx")
|
||||
t.Logf("after removing bvb-hfv %+v", g)
|
||||
}
|
||||
|
||||
func TestCreateExampleMermaid(t *testing.T) {
|
||||
filename := "example"
|
||||
g := ReadGraphFile(filename)
|
||||
|
||||
g.SaveAsMermaid("example-graph.mmd")
|
||||
}
|
||||
|
||||
func TestComponentOnInitial(t *testing.T) {
|
||||
// should be all nodes
|
||||
filename := "example"
|
||||
g := ReadGraphFile(filename)
|
||||
comp := g.ComponentFrom("bvb")
|
||||
t.Logf("got component %+v", comp)
|
||||
if comp.Cardinality() != len(g.Nodes) {
|
||||
t.Errorf("should have same size!")
|
||||
}
|
||||
}
|
||||
|
||||
func TestComponentOnMini(t *testing.T) {
|
||||
// should not be all nodes
|
||||
filename := "example2"
|
||||
g := ReadGraphFile(filename)
|
||||
comp := g.ComponentFrom("jqt")
|
||||
t.Logf("got component %+v", comp)
|
||||
if comp.Cardinality() == len(g.Nodes) {
|
||||
t.Errorf("should have different size!")
|
||||
}
|
||||
}
|
||||
|
||||
func TestRemoveAllCycles(t *testing.T) {
|
||||
filename := "example"
|
||||
g := ReadGraphFile(filename)
|
||||
g.SaveAsMermaid("example-before-removing.mmd")
|
||||
t.Logf("initial graph is %+v", g)
|
||||
edges := g.RemoveAllCycles()
|
||||
expectedNecessary := mapset.NewSet[Edge](
|
||||
CreateEdge("hfx", "pzl"),
|
||||
CreateEdge("bvb", "cmg"),
|
||||
CreateEdge("nvd", "jqt"),
|
||||
)
|
||||
|
||||
intersection := expectedNecessary.Intersect(edges)
|
||||
t.Logf("i expect that exactly two will be in intersection %+v", intersection)
|
||||
if intersection.Cardinality() != 2 {
|
||||
panic("huh?")
|
||||
// ok, this is not what i expected.
|
||||
// this is unstable. but i could run it several times? and hopefully luck out?
|
||||
}
|
||||
|
||||
t.Logf("removed edges %+v", edges)
|
||||
t.Logf("after removal graph is %+v", g)
|
||||
g.SaveAsMermaid("example-after-removing.mmd")
|
||||
}
|
||||
|
||||
func TestSplittingExample(t *testing.T) {
|
||||
filename := "example"
|
||||
g := ReadGraphFile(filename)
|
||||
result := g.TryToSplit()
|
||||
t.Logf("hopefully same as example answer: %d", result)
|
||||
}
|
||||
|
||||
func TestSplittingInput(t *testing.T) {
|
||||
// kind of brute force
|
||||
result := 0
|
||||
filename := "input"
|
||||
|
||||
for result == 0 {
|
||||
g := ReadGraphFile(filename)
|
||||
result = g.TryToSplit()
|
||||
t.Logf("hopefully as answer: %d", result)
|
||||
}
|
||||
|
||||
}
|
||||
480
day25/notes.org
Normal file
480
day25/notes.org
Normal file
@@ -0,0 +1,480 @@
|
||||
#+title: Notes
|
||||
* ok, not so simple
|
||||
'how to find 3 edges which if removed split graph into 2 components'
|
||||
|
||||
i guess i could find all cycles?
|
||||
and then what?
|
||||
|
||||
then if i have the graph without cycles and all of the removed edges,
|
||||
maybe i can figure out how to remove one more edge, to make two components, and add the other (k - 2) edges back in a way that would not connect those components.
|
||||
|
||||
well, the cycles maybe should have been broken elsewhere?
|
||||
for most cycles - it doesn't matter.
|
||||
|
||||
for cycles that break into 2 components, does it?
|
||||
|
||||
ok. if i have the no-cycle graph.
|
||||
then we can maybe start searching for 'third' edge to remove.
|
||||
|
||||
i remove it, i get two components.
|
||||
then as a test - exactly two edges when added should connect those components.
|
||||
if that happens - i have an answer
|
||||
|
||||
sounds good
|
||||
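a minimal sketch of that final check with plain maps (the two components and the removed-edge list are assumed to come from the steps above):

#+begin_src go
package main

import "fmt"

// with one extra edge removed we get components compA and compB; the split is right
// when exactly two of the previously removed edges would reconnect them
// (those two plus the extra one are the three cuts)
func splitsInTwo(compA, compB map[string]bool, removed [][2]string) bool {
	connecting := 0
	for _, e := range removed {
		sameSide := (compA[e[0]] && compA[e[1]]) || (compB[e[0]] && compB[e[1]])
		if !sameSide {
			connecting++
		}
	}
	return connecting == 2
}

func main() {
	compA := map[string]bool{"a": true, "b": true}
	compB := map[string]bool{"c": true, "d": true}
	removed := [][2]string{{"a", "b"}, {"a", "c"}, {"b", "d"}}
	fmt.Println(splitsInTwo(compA, compB, removed)) // true: a-c and b-d cross the cut
}
#+end_src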
** making code to get components.
|
||||
now.
|
||||
** make method to remove all cycles.
|
||||
save the removed edges to slice
|
||||
|
||||
i guess could be a method that modifies the graph and returns all removed edges.
|
||||
and test that
|
||||
** why does a node end up without any edges?
|
||||
#+begin_src
|
||||
=== RUN TestRemoveAllCycles
|
||||
graph_test.go:55: initial graph is {Nodes:map[bvb:[bvb : Set{xhk, hfx, ntq}] cmg:[cmg : Set{qnr, nvd, lhk, bvb, rzs}] frs:[frs : Set{qnr, lhk, lsr}] hfx:[hfx : Set{rhn, bvb, pzl, ntq, xhk}] jqt:[jqt : Set{rhn, xhk, nvd, ntq}] lhk:[lhk : Set{nvd, lsr, frs, cmg}] lsr:[lsr : Set{rzs, frs, lhk}] ntq:[ntq : Set{jqt, hfx, bvb, xhk}] nvd:[nvd : Set{lhk}] pzl:[pzl : Set{lsr, hfx, nvd}] qnr:[qnr : Set{nvd, rzs, frs}] rhn:[rhn : Set{bvb, hfx, xhk}] rsh:[rsh : Set{lsr, rzs, frs, pzl}] rzs:[rzs : Set{lsr, rsh, qnr, cmg}] xhk:[xhk : Set{hfx, rhn, bvb, ntq}]]}
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from lsr and neighbors Set{lhk, rzs, frs}
|
||||
2023/12/25 08:14:02 initial dfs from lsr to lhk with initial visited Set{lsr}
|
||||
2023/12/25 08:14:02 > step from lsr to lhk. visited : Set{lsr}
|
||||
2023/12/25 08:14:02 > step from lhk to frs. visited : Set{lsr, lhk}
|
||||
2023/12/25 08:14:02 > step from frs to lsr. visited : Set{lhk, frs, lsr}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from frs to lsr
|
||||
2023/12/25 08:14:02 entering remove edge for frs and lsr
|
||||
2023/12/25 08:14:02 removed first [frs : Set{qnr, lhk}]
|
||||
2023/12/25 08:14:02 removed second [lsr : Set{lhk, rzs}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from xhk and neighbors Set{hfx, rhn, bvb, ntq}
|
||||
2023/12/25 08:14:02 initial dfs from xhk to ntq with initial visited Set{xhk}
|
||||
2023/12/25 08:14:02 > step from xhk to ntq. visited : Set{xhk}
|
||||
2023/12/25 08:14:02 > step from ntq to jqt. visited : Set{xhk, ntq}
|
||||
2023/12/25 08:14:02 > step from jqt to rhn. visited : Set{xhk, ntq, jqt}
|
||||
2023/12/25 08:14:02 > step from rhn to bvb. visited : Set{rhn, xhk, ntq, jqt}
|
||||
2023/12/25 08:14:02 > step from bvb to xhk. visited : Set{xhk, ntq, jqt, rhn, bvb}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from bvb to xhk
|
||||
2023/12/25 08:14:02 entering remove edge for bvb and xhk
|
||||
2023/12/25 08:14:02 removed first [bvb : Set{hfx, ntq}]
|
||||
2023/12/25 08:14:02 removed second [xhk : Set{ntq, hfx, rhn}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from rhn and neighbors Set{bvb, hfx, xhk}
|
||||
2023/12/25 08:14:02 initial dfs from rhn to xhk with initial visited Set{rhn}
|
||||
2023/12/25 08:14:02 > step from rhn to xhk. visited : Set{rhn}
|
||||
2023/12/25 08:14:02 > step from xhk to hfx. visited : Set{rhn, xhk}
|
||||
2023/12/25 08:14:02 > step from hfx to bvb. visited : Set{rhn, xhk, hfx}
|
||||
2023/12/25 08:14:02 > step from bvb to hfx. visited : Set{hfx, rhn, xhk, bvb}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from bvb to hfx
|
||||
2023/12/25 08:14:02 entering remove edge for bvb and hfx
|
||||
2023/12/25 08:14:02 removed first [bvb : Set{ntq}]
|
||||
2023/12/25 08:14:02 removed second [hfx : Set{rhn, pzl, ntq, xhk}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from rsh and neighbors Set{frs, pzl, lsr, rzs}
|
||||
2023/12/25 08:14:02 initial dfs from rsh to frs with initial visited Set{rsh}
|
||||
2023/12/25 08:14:02 > step from rsh to frs. visited : Set{rsh}
|
||||
2023/12/25 08:14:02 > step from frs to qnr. visited : Set{rsh, frs}
|
||||
2023/12/25 08:14:02 > step from qnr to nvd. visited : Set{qnr, rsh, frs}
|
||||
2023/12/25 08:14:02 > step from nvd to lhk. visited : Set{rsh, frs, qnr, nvd}
|
||||
2023/12/25 08:14:02 > step from lhk to cmg. visited : Set{rsh, frs, qnr, nvd, lhk}
|
||||
2023/12/25 08:14:02 > step from cmg to qnr. visited : Set{rsh, frs, qnr, nvd, lhk, cmg}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from cmg to qnr
|
||||
2023/12/25 08:14:02 entering remove edge for cmg and qnr
|
||||
2023/12/25 08:14:02 removed first [cmg : Set{nvd, lhk, bvb, rzs}]
|
||||
2023/12/25 08:14:02 removed second [qnr : Set{nvd, rzs, frs}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from rzs and neighbors Set{rsh, qnr, cmg, lsr}
|
||||
2023/12/25 08:14:02 initial dfs from rzs to cmg with initial visited Set{rzs}
|
||||
2023/12/25 08:14:02 > step from rzs to cmg. visited : Set{rzs}
|
||||
2023/12/25 08:14:02 > step from cmg to nvd. visited : Set{rzs, cmg}
|
||||
2023/12/25 08:14:02 > step from nvd to lhk. visited : Set{cmg, nvd, rzs}
|
||||
2023/12/25 08:14:02 > step from lhk to cmg. visited : Set{cmg, lhk, nvd, rzs}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from lhk to cmg
|
||||
2023/12/25 08:14:02 entering remove edge for lhk and cmg
|
||||
2023/12/25 08:14:02 removed first [lhk : Set{frs, nvd, lsr}]
|
||||
2023/12/25 08:14:02 removed second [cmg : Set{rzs, nvd, bvb}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from qnr and neighbors Set{nvd, rzs, frs}
|
||||
2023/12/25 08:14:02 initial dfs from qnr to nvd with initial visited Set{qnr}
|
||||
2023/12/25 08:14:02 > step from qnr to nvd. visited : Set{qnr}
|
||||
2023/12/25 08:14:02 > step from nvd to lhk. visited : Set{qnr, nvd}
|
||||
2023/12/25 08:14:02 > step from lhk to nvd. visited : Set{lhk, qnr, nvd}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from lhk to nvd
|
||||
2023/12/25 08:14:02 entering remove edge for lhk and nvd
|
||||
2023/12/25 08:14:02 removed first [lhk : Set{lsr, frs}]
|
||||
2023/12/25 08:14:02 removed second [nvd : Set{}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from jqt and neighbors Set{rhn, xhk, nvd, ntq}
|
||||
2023/12/25 08:14:02 initial dfs from jqt to xhk with initial visited Set{jqt}
|
||||
2023/12/25 08:14:02 > step from jqt to xhk. visited : Set{jqt}
|
||||
2023/12/25 08:14:02 > step from xhk to ntq. visited : Set{jqt, xhk}
|
||||
2023/12/25 08:14:02 > step from ntq to bvb. visited : Set{jqt, xhk, ntq}
|
||||
2023/12/25 08:14:02 > step from bvb to ntq. visited : Set{jqt, xhk, ntq, bvb}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from bvb to ntq
|
||||
2023/12/25 08:14:02 entering remove edge for bvb and ntq
|
||||
2023/12/25 08:14:02 removed first [bvb : Set{}]
|
||||
2023/12/25 08:14:02 removed second [ntq : Set{jqt, hfx, xhk}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from xhk and neighbors Set{hfx, rhn, ntq}
|
||||
2023/12/25 08:14:02 initial dfs from xhk to hfx with initial visited Set{xhk}
|
||||
2023/12/25 08:14:02 > step from xhk to hfx. visited : Set{xhk}
|
||||
2023/12/25 08:14:02 > step from hfx to xhk. visited : Set{xhk, hfx}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from hfx to xhk
|
||||
2023/12/25 08:14:02 entering remove edge for hfx and xhk
|
||||
2023/12/25 08:14:02 removed first [hfx : Set{ntq, rhn, pzl}]
|
||||
2023/12/25 08:14:02 removed second [xhk : Set{rhn, ntq}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from lsr and neighbors Set{lhk, rzs}
|
||||
2023/12/25 08:14:02 initial dfs from lsr to rzs with initial visited Set{lsr}
|
||||
2023/12/25 08:14:02 > step from lsr to rzs. visited : Set{lsr}
|
||||
2023/12/25 08:14:02 > step from rzs to qnr. visited : Set{lsr, rzs}
|
||||
2023/12/25 08:14:02 > step from qnr to rzs. visited : Set{lsr, rzs, qnr}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from qnr to rzs
|
||||
2023/12/25 08:14:02 entering remove edge for qnr and rzs
|
||||
2023/12/25 08:14:02 removed first [qnr : Set{frs, nvd}]
|
||||
2023/12/25 08:14:02 removed second [rzs : Set{cmg, lsr, rsh}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from pzl and neighbors Set{lsr, hfx, nvd}
|
||||
2023/12/25 08:14:02 initial dfs from pzl to lsr with initial visited Set{pzl}
|
||||
2023/12/25 08:14:02 > step from pzl to lsr. visited : Set{pzl}
|
||||
2023/12/25 08:14:02 > step from lsr to lhk. visited : Set{pzl, lsr}
|
||||
2023/12/25 08:14:02 > step from lhk to lsr. visited : Set{lsr, lhk, pzl}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from lhk to lsr
|
||||
2023/12/25 08:14:02 entering remove edge for lhk and lsr
|
||||
2023/12/25 08:14:02 removed first [lhk : Set{frs}]
|
||||
2023/12/25 08:14:02 removed second [lsr : Set{rzs}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from rhn and neighbors Set{xhk, bvb, hfx}
|
||||
2023/12/25 08:14:02 initial dfs from rhn to xhk with initial visited Set{rhn}
|
||||
2023/12/25 08:14:02 > step from rhn to xhk. visited : Set{rhn}
|
||||
2023/12/25 08:14:02 > step from xhk to rhn. visited : Set{xhk, rhn}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from xhk to rhn
|
||||
2023/12/25 08:14:02 entering remove edge for xhk and rhn
|
||||
2023/12/25 08:14:02 removed first [xhk : Set{ntq}]
|
||||
2023/12/25 08:14:02 removed second [rhn : Set{bvb, hfx}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from rsh and neighbors Set{frs, pzl, lsr, rzs}
|
||||
2023/12/25 08:14:02 initial dfs from rsh to frs with initial visited Set{rsh}
|
||||
2023/12/25 08:14:02 > step from rsh to frs. visited : Set{rsh}
|
||||
2023/12/25 08:14:02 > step from frs to qnr. visited : Set{rsh, frs}
|
||||
2023/12/25 08:14:02 > step from qnr to frs. visited : Set{rsh, frs, qnr}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from qnr to frs
|
||||
2023/12/25 08:14:02 entering remove edge for qnr and frs
|
||||
2023/12/25 08:14:02 removed first [qnr : Set{nvd}]
|
||||
2023/12/25 08:14:02 removed second [frs : Set{lhk}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from rhn and neighbors Set{bvb, hfx}
|
||||
2023/12/25 08:14:02 initial dfs from rhn to hfx with initial visited Set{rhn}
|
||||
2023/12/25 08:14:02 > step from rhn to hfx. visited : Set{rhn}
|
||||
2023/12/25 08:14:02 > step from hfx to rhn. visited : Set{rhn, hfx}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from hfx to rhn
|
||||
2023/12/25 08:14:02 entering remove edge for hfx and rhn
|
||||
2023/12/25 08:14:02 removed first [hfx : Set{pzl, ntq}]
|
||||
2023/12/25 08:14:02 removed second [rhn : Set{bvb}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from pzl and neighbors Set{hfx, nvd, lsr}
|
||||
2023/12/25 08:14:02 initial dfs from pzl to lsr with initial visited Set{pzl}
|
||||
2023/12/25 08:14:02 > step from pzl to lsr. visited : Set{pzl}
|
||||
2023/12/25 08:14:02 > step from lsr to rzs. visited : Set{pzl, lsr}
|
||||
2023/12/25 08:14:02 > step from rzs to cmg. visited : Set{pzl, lsr, rzs}
|
||||
2023/12/25 08:14:02 > step from cmg to rzs. visited : Set{pzl, lsr, rzs, cmg}
|
||||
2023/12/25 08:14:02 <<<< cycle true, from cmg to rzs
|
||||
2023/12/25 08:14:02 entering remove edge for cmg and rzs
|
||||
2023/12/25 08:14:02 removed first [cmg : Set{nvd, bvb}]
|
||||
2023/12/25 08:14:02 removed second [rzs : Set{lsr, rsh}]
|
||||
2023/12/25 08:14:02 >>>> starting new find cycle
|
||||
2023/12/25 08:14:02 initial search from rhn and neighbors Set{bvb}
|
||||
2023/12/25 08:14:02 initial dfs from rhn to bvb with initial visited Set{rhn}
|
||||
2023/12/25 08:14:02 > step from rhn to bvb. visited : Set{rhn}
|
||||
2023/12/25 08:14:02 <<<< cycle false, from to
|
||||
graph_test.go:57: removed edges Set{{lhk nvd}, {hfx xhk}, {frs qnr}, {cmg rzs}, {bvb xhk}, {bvb hfx}, {cmg qnr}, {cmg lhk}, {qnr rzs}, {hfx rhn}, {frs lsr}, {rhn xhk}, {bvb ntq}, {lhk lsr}}
|
||||
graph_test.go:58: after removal graph is {Nodes:map[bvb:[bvb : Set{}] cmg:[cmg : Set{nvd, bvb}] frs:[frs : Set{lhk}] hfx:[hfx : Set{pzl, ntq}] jqt:[jqt : Set{rhn, xhk, nvd, ntq}] lhk:[lhk : Set{frs}] lsr:[lsr : Set{rzs}] ntq:[ntq : Set{xhk, jqt, hfx}] nvd:[nvd : Set{}] pzl:[pzl : Set{nvd, lsr, hfx}] qnr:[qnr : Set{nvd}] rhn:[rhn : Set{bvb}] rsh:[rsh : Set{frs, pzl, lsr, rzs}] rzs:[rzs : Set{lsr, rsh}] xhk:[xhk : Set{ntq}]]}
|
||||
--- PASS: TestRemoveAllCycles (0.00s)
|
||||
|
||||
#+end_src
|
||||
*** because i was overwriting nodes on file read
|
||||
** now why do i get empty sets?
|
||||
#+begin_src
|
||||
=== RUN TestRemoveAllCycles
|
||||
graph_test.go:55: initial graph is {Nodes:map[bvb:[bvb : Set{cmg, rhn, xhk, hfx, ntq}] cmg:[cmg : Set{lhk, bvb, rzs, qnr, nvd}] frs:[frs : Set{rsh, qnr, lhk, lsr}] hfx:[hfx : Set{xhk, rhn, bvb, pzl, ntq}] jqt:[jqt : Set{rhn, xhk, nvd, ntq}] lhk:[lhk : Set{nvd, lsr, frs, cmg}] lsr:[lsr : Set{rsh, pzl, lhk, rzs, frs}] ntq:[ntq : Set{jqt, hfx, bvb, xhk}] nvd:[nvd : Set{pzl, qnr, lhk, jqt, cmg}] pzl:[pzl : Set{hfx, nvd, rsh, lsr}] qnr:[qnr : Set{cmg, nvd, rzs, frs}] rhn:[rhn : Set{jqt, xhk, bvb, hfx}] rsh:[rsh : Set{lsr, rzs, frs, pzl}] rzs:[rzs : Set{qnr, cmg, lsr, rsh}] xhk:[xhk : Set{hfx, rhn, bvb, ntq, jqt}]]}
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from xhk and neighbors Set{rhn, bvb, ntq, jqt, hfx}
|
||||
2023/12/25 08:31:09 initial dfs from xhk to rhn with initial visited Set{xhk}
|
||||
2023/12/25 08:31:09 > step from xhk to rhn. visited : Set{xhk}
|
||||
2023/12/25 08:31:09 > step from rhn to jqt. visited : Set{xhk, rhn}
|
||||
2023/12/25 08:31:09 > step from jqt to rhn. visited : Set{xhk, rhn, jqt}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from jqt to rhn
|
||||
2023/12/25 08:31:09 entering remove edge for jqt and rhn
|
||||
2023/12/25 08:31:09 before remove first [jqt : Set{xhk, nvd, ntq, rhn}]
|
||||
2023/12/25 08:31:09 removed first [jqt : Set{ntq, xhk, nvd}]
|
||||
2023/12/25 08:31:09 before remove second [jqt : Set{nvd, ntq, xhk}]
|
||||
2023/12/25 08:31:09 removed second [rhn : Set{hfx, xhk, bvb}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from pzl and neighbors Set{hfx, nvd, rsh, lsr}
|
||||
2023/12/25 08:31:09 initial dfs from pzl to nvd with initial visited Set{pzl}
|
||||
2023/12/25 08:31:09 > step from pzl to nvd. visited : Set{pzl}
|
||||
2023/12/25 08:31:09 > step from nvd to jqt. visited : Set{pzl, nvd}
|
||||
2023/12/25 08:31:09 > step from jqt to ntq. visited : Set{pzl, nvd, jqt}
|
||||
2023/12/25 08:31:09 > step from ntq to xhk. visited : Set{pzl, nvd, jqt, ntq}
|
||||
2023/12/25 08:31:09 > step from xhk to jqt. visited : Set{ntq, pzl, nvd, jqt, xhk}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from xhk to jqt
|
||||
2023/12/25 08:31:09 entering remove edge for xhk and jqt
|
||||
2023/12/25 08:31:09 before remove first [xhk : Set{jqt, hfx, rhn, bvb, ntq}]
|
||||
2023/12/25 08:31:09 removed first [xhk : Set{ntq, hfx, rhn, bvb}]
|
||||
2023/12/25 08:31:09 before remove second [xhk : Set{hfx, rhn, bvb, ntq}]
|
||||
2023/12/25 08:31:09 removed second [jqt : Set{nvd, ntq}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from frs and neighbors Set{rsh, qnr, lhk, lsr}
|
||||
2023/12/25 08:31:09 initial dfs from frs to lhk with initial visited Set{frs}
|
||||
2023/12/25 08:31:09 > step from frs to lhk. visited : Set{frs}
|
||||
2023/12/25 08:31:09 > step from lhk to nvd. visited : Set{frs, lhk}
|
||||
2023/12/25 08:31:09 > step from nvd to cmg. visited : Set{lhk, frs, nvd}
|
||||
2023/12/25 08:31:09 > step from cmg to lhk. visited : Set{frs, nvd, lhk, cmg}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from cmg to lhk
|
||||
2023/12/25 08:31:09 entering remove edge for cmg and lhk
|
||||
2023/12/25 08:31:09 before remove first [cmg : Set{qnr, nvd, lhk, bvb, rzs}]
|
||||
2023/12/25 08:31:09 removed first [cmg : Set{nvd, bvb, rzs, qnr}]
|
||||
2023/12/25 08:31:09 before remove second [cmg : Set{bvb, rzs, qnr, nvd}]
|
||||
2023/12/25 08:31:09 removed second [lhk : Set{nvd, lsr, frs}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from xhk and neighbors Set{hfx, rhn, bvb, ntq}
|
||||
2023/12/25 08:31:09 initial dfs from xhk to ntq with initial visited Set{xhk}
|
||||
2023/12/25 08:31:09 > step from xhk to ntq. visited : Set{xhk}
|
||||
2023/12/25 08:31:09 > step from ntq to jqt. visited : Set{xhk, ntq}
|
||||
2023/12/25 08:31:09 > step from jqt to ntq. visited : Set{xhk, ntq, jqt}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from jqt to ntq
|
||||
2023/12/25 08:31:09 entering remove edge for jqt and ntq
|
||||
2023/12/25 08:31:09 before remove first [jqt : Set{ntq, nvd}]
|
||||
2023/12/25 08:31:09 removed first [jqt : Set{nvd}]
|
||||
2023/12/25 08:31:09 before remove second [jqt : Set{nvd}]
|
||||
2023/12/25 08:31:09 removed second [ntq : Set{xhk, hfx, bvb}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from nvd and neighbors Set{jqt, cmg, pzl, qnr, lhk}
|
||||
2023/12/25 08:31:09 initial dfs from nvd to cmg with initial visited Set{nvd}
|
||||
2023/12/25 08:31:09 > step from nvd to cmg. visited : Set{nvd}
|
||||
2023/12/25 08:31:09 > step from cmg to bvb. visited : Set{nvd, cmg}
|
||||
2023/12/25 08:31:09 > step from bvb to cmg. visited : Set{cmg, nvd, bvb}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from bvb to cmg
|
||||
2023/12/25 08:31:09 entering remove edge for bvb and cmg
|
||||
2023/12/25 08:31:09 before remove first [bvb : Set{cmg, rhn, xhk, hfx, ntq}]
|
||||
2023/12/25 08:31:09 removed first [bvb : Set{xhk, hfx, ntq, rhn}]
|
||||
2023/12/25 08:31:09 before remove second [bvb : Set{rhn, xhk, hfx, ntq}]
|
||||
2023/12/25 08:31:09 removed second [cmg : Set{rzs, qnr, nvd}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from frs and neighbors Set{rsh, qnr, lhk, lsr}
|
||||
2023/12/25 08:31:09 initial dfs from frs to lsr with initial visited Set{frs}
|
||||
2023/12/25 08:31:09 > step from frs to lsr. visited : Set{frs}
|
||||
2023/12/25 08:31:09 > step from lsr to rzs. visited : Set{frs, lsr}
|
||||
2023/12/25 08:31:09 > step from rzs to cmg. visited : Set{frs, lsr, rzs}
|
||||
2023/12/25 08:31:09 > step from cmg to rzs. visited : Set{lsr, rzs, cmg, frs}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from cmg to rzs
|
||||
2023/12/25 08:31:09 entering remove edge for cmg and rzs
|
||||
2023/12/25 08:31:09 before remove first [cmg : Set{rzs, qnr, nvd}]
|
||||
2023/12/25 08:31:09 removed first [cmg : Set{qnr, nvd}]
|
||||
2023/12/25 08:31:09 before remove second [cmg : Set{qnr, nvd}]
|
||||
2023/12/25 08:31:09 removed second [rzs : Set{lsr, rsh, qnr}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from ntq and neighbors Set{xhk, hfx, bvb}
|
||||
2023/12/25 08:31:09 initial dfs from ntq to bvb with initial visited Set{ntq}
|
||||
2023/12/25 08:31:09 > step from ntq to bvb. visited : Set{ntq}
|
||||
2023/12/25 08:31:09 > step from bvb to rhn. visited : Set{ntq, bvb}
|
||||
2023/12/25 08:31:09 > step from rhn to xhk. visited : Set{bvb, ntq, rhn}
|
||||
2023/12/25 08:31:09 > step from xhk to hfx. visited : Set{bvb, ntq, rhn, xhk}
|
||||
2023/12/25 08:31:09 > step from hfx to bvb. visited : Set{xhk, bvb, ntq, rhn, hfx}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from hfx to bvb
|
||||
2023/12/25 08:31:09 entering remove edge for hfx and bvb
|
||||
2023/12/25 08:31:09 before remove first [hfx : Set{rhn, bvb, pzl, ntq, xhk}]
|
||||
2023/12/25 08:31:09 removed first [hfx : Set{ntq, xhk, rhn, pzl}]
|
||||
2023/12/25 08:31:09 before remove second [hfx : Set{pzl, ntq, xhk, rhn}]
|
||||
2023/12/25 08:31:09 removed second [bvb : Set{ntq, rhn, xhk}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from rhn and neighbors Set{xhk, bvb, hfx}
|
||||
2023/12/25 08:31:09 initial dfs from rhn to bvb with initial visited Set{rhn}
|
||||
2023/12/25 08:31:09 > step from rhn to bvb. visited : Set{rhn}
|
||||
2023/12/25 08:31:09 > step from bvb to rhn. visited : Set{bvb, rhn}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from bvb to rhn
|
||||
2023/12/25 08:31:09 entering remove edge for bvb and rhn
|
||||
2023/12/25 08:31:09 before remove first [bvb : Set{ntq, rhn, xhk}]
|
||||
2023/12/25 08:31:09 removed first [bvb : Set{ntq, xhk}]
|
||||
2023/12/25 08:31:09 before remove second [bvb : Set{ntq, xhk}]
|
||||
2023/12/25 08:31:09 removed second [rhn : Set{xhk, hfx}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from rhn and neighbors Set{xhk, hfx}
|
||||
2023/12/25 08:31:09 initial dfs from rhn to hfx with initial visited Set{rhn}
|
||||
2023/12/25 08:31:09 > step from rhn to hfx. visited : Set{rhn}
|
||||
2023/12/25 08:31:09 > step from hfx to pzl. visited : Set{rhn, hfx}
|
||||
2023/12/25 08:31:09 > step from pzl to rsh. visited : Set{rhn, hfx, pzl}
|
||||
2023/12/25 08:31:09 > step from rsh to frs. visited : Set{hfx, pzl, rsh, rhn}
|
||||
2023/12/25 08:31:09 > step from frs to qnr. visited : Set{rsh, frs, rhn, hfx, pzl}
|
||||
2023/12/25 08:31:09 > step from qnr to cmg. visited : Set{rsh, frs, rhn, hfx, pzl, qnr}
|
||||
2023/12/25 08:31:09 > step from cmg to qnr. visited : Set{hfx, pzl, qnr, cmg, rsh, frs, rhn}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from cmg to qnr
|
||||
2023/12/25 08:31:09 entering remove edge for cmg and qnr
|
||||
2023/12/25 08:31:09 before remove first [cmg : Set{qnr, nvd}]
|
||||
2023/12/25 08:31:09 removed first [cmg : Set{nvd}]
|
||||
2023/12/25 08:31:09 before remove second [cmg : Set{nvd}]
|
||||
2023/12/25 08:31:09 removed second [qnr : Set{nvd, rzs, frs}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from lsr and neighbors Set{rsh, pzl, lhk, rzs, frs}
|
||||
2023/12/25 08:31:09 initial dfs from lsr to frs with initial visited Set{lsr}
|
||||
2023/12/25 08:31:09 > step from lsr to frs. visited : Set{lsr}
|
||||
2023/12/25 08:31:09 > step from frs to rsh. visited : Set{lsr, frs}
|
||||
2023/12/25 08:31:09 > step from rsh to frs. visited : Set{lsr, frs, rsh}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from rsh to frs
|
||||
2023/12/25 08:31:09 entering remove edge for rsh and frs
|
||||
2023/12/25 08:31:09 before remove first [rsh : Set{frs, pzl, lsr, rzs}]
|
||||
2023/12/25 08:31:09 removed first [rsh : Set{rzs, pzl, lsr}]
|
||||
2023/12/25 08:31:09 before remove second [rsh : Set{rzs, pzl, lsr}]
|
||||
2023/12/25 08:31:09 removed second [frs : Set{qnr, lhk, lsr}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from rhn and neighbors Set{xhk, hfx}
|
||||
2023/12/25 08:31:09 initial dfs from rhn to xhk with initial visited Set{rhn}
|
||||
2023/12/25 08:31:09 > step from rhn to xhk. visited : Set{rhn}
|
||||
2023/12/25 08:31:09 > step from xhk to hfx. visited : Set{rhn, xhk}
|
||||
2023/12/25 08:31:09 > step from hfx to pzl. visited : Set{rhn, xhk, hfx}
|
||||
2023/12/25 08:31:09 > step from pzl to rsh. visited : Set{rhn, xhk, hfx, pzl}
|
||||
2023/12/25 08:31:09 > step from rsh to rzs. visited : Set{rhn, xhk, hfx, pzl, rsh}
|
||||
2023/12/25 08:31:09 > step from rzs to rsh. visited : Set{rzs, rsh, rhn, xhk, hfx, pzl}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from rzs to rsh
|
||||
2023/12/25 08:31:09 entering remove edge for rzs and rsh
|
||||
2023/12/25 08:31:09 before remove first [rzs : Set{lsr, rsh, qnr}]
|
||||
2023/12/25 08:31:09 removed first [rzs : Set{qnr, lsr}]
|
||||
2023/12/25 08:31:09 before remove second [rzs : Set{lsr, qnr}]
|
||||
2023/12/25 08:31:09 removed second [rsh : Set{pzl, lsr}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from xhk and neighbors Set{hfx, rhn, bvb, ntq}
|
||||
2023/12/25 08:31:09 initial dfs from xhk to hfx with initial visited Set{xhk}
|
||||
2023/12/25 08:31:09 > step from xhk to hfx. visited : Set{xhk}
|
||||
2023/12/25 08:31:09 > step from hfx to pzl. visited : Set{xhk, hfx}
|
||||
2023/12/25 08:31:09 > step from pzl to rsh. visited : Set{xhk, hfx, pzl}
|
||||
2023/12/25 08:31:09 > step from rsh to pzl. visited : Set{pzl, xhk, hfx, rsh}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from rsh to pzl
|
||||
2023/12/25 08:31:09 entering remove edge for rsh and pzl
|
||||
2023/12/25 08:31:09 before remove first [rsh : Set{pzl, lsr}]
|
||||
2023/12/25 08:31:09 removed first [rsh : Set{lsr}]
|
||||
2023/12/25 08:31:09 before remove second [rsh : Set{lsr}]
|
||||
2023/12/25 08:31:09 removed second [pzl : Set{lsr, hfx, nvd}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from rsh and neighbors Set{lsr}
|
||||
2023/12/25 08:31:09 initial dfs from rsh to lsr with initial visited Set{rsh}
|
||||
2023/12/25 08:31:09 > step from rsh to lsr. visited : Set{rsh}
|
||||
2023/12/25 08:31:09 > step from lsr to frs. visited : Set{rsh, lsr}
|
||||
2023/12/25 08:31:09 > step from frs to qnr. visited : Set{frs, lsr, rsh}
|
||||
2023/12/25 08:31:09 > step from qnr to nvd. visited : Set{lsr, rsh, frs, qnr}
|
||||
2023/12/25 08:31:09 > step from nvd to jqt. visited : Set{rsh, frs, qnr, nvd, lsr}
|
||||
2023/12/25 08:31:09 > step from jqt to nvd. visited : Set{lsr, rsh, frs, qnr, nvd, jqt}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from jqt to nvd
|
||||
2023/12/25 08:31:09 entering remove edge for jqt and nvd
|
||||
2023/12/25 08:31:09 before remove first [jqt : Set{nvd}]
|
||||
2023/12/25 08:31:09 removed first [jqt : Set{}]
|
||||
2023/12/25 08:31:09 before remove second [jqt : Set{}]
|
||||
2023/12/25 08:31:09 removed second [nvd : Set{cmg, pzl, qnr, lhk}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from bvb and neighbors Set{ntq, xhk}
|
||||
2023/12/25 08:31:09 initial dfs from bvb to ntq with initial visited Set{bvb}
|
||||
2023/12/25 08:31:09 > step from bvb to ntq. visited : Set{bvb}
|
||||
2023/12/25 08:31:09 > step from ntq to xhk. visited : Set{bvb, ntq}
|
||||
2023/12/25 08:31:09 > step from xhk to hfx. visited : Set{ntq, xhk, bvb}
|
||||
2023/12/25 08:31:09 > step from hfx to pzl. visited : Set{bvb, ntq, xhk, hfx}
|
||||
2023/12/25 08:31:09 > step from pzl to nvd. visited : Set{pzl, bvb, ntq, xhk, hfx}
|
||||
2023/12/25 08:31:09 > step from nvd to qnr. visited : Set{hfx, pzl, nvd, bvb, ntq, xhk}
|
||||
2023/12/25 08:31:09 > step from qnr to nvd. visited : Set{pzl, nvd, bvb, ntq, xhk, hfx, qnr}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from qnr to nvd
|
||||
2023/12/25 08:31:09 entering remove edge for qnr and nvd
|
||||
2023/12/25 08:31:09 before remove first [qnr : Set{nvd, rzs, frs}]
|
||||
2023/12/25 08:31:09 removed first [qnr : Set{rzs, frs}]
|
||||
2023/12/25 08:31:09 before remove second [qnr : Set{frs, rzs}]
|
||||
2023/12/25 08:31:09 removed second [nvd : Set{pzl, lhk, cmg}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from pzl and neighbors Set{lsr, hfx, nvd}
|
||||
2023/12/25 08:31:09 initial dfs from pzl to lsr with initial visited Set{pzl}
|
||||
2023/12/25 08:31:09 > step from pzl to lsr. visited : Set{pzl}
|
||||
2023/12/25 08:31:09 > step from lsr to rsh. visited : Set{pzl, lsr}
|
||||
2023/12/25 08:31:09 > step from rsh to lsr. visited : Set{pzl, lsr, rsh}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from rsh to lsr
|
||||
2023/12/25 08:31:09 entering remove edge for rsh and lsr
|
||||
2023/12/25 08:31:09 before remove first [rsh : Set{lsr}]
|
||||
2023/12/25 08:31:09 removed first [rsh : Set{}]
|
||||
2023/12/25 08:31:09 before remove second [rsh : Set{}]
|
||||
2023/12/25 08:31:09 removed second [lsr : Set{rzs, frs, pzl, lhk}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from rhn and neighbors Set{xhk, hfx}
|
||||
2023/12/25 08:31:09 initial dfs from rhn to xhk with initial visited Set{rhn}
|
||||
2023/12/25 08:31:09 > step from rhn to xhk. visited : Set{rhn}
|
||||
2023/12/25 08:31:09 > step from xhk to hfx. visited : Set{rhn, xhk}
|
||||
2023/12/25 08:31:09 > step from hfx to pzl. visited : Set{hfx, rhn, xhk}
|
||||
2023/12/25 08:31:09 > step from pzl to lsr. visited : Set{rhn, xhk, hfx, pzl}
|
||||
2023/12/25 08:31:09 > step from lsr to frs. visited : Set{xhk, hfx, pzl, rhn, lsr}
|
||||
2023/12/25 08:31:09 > step from frs to qnr. visited : Set{rhn, lsr, xhk, hfx, pzl, frs}
|
||||
2023/12/25 08:31:09 > step from qnr to rzs. visited : Set{qnr, rhn, lsr, xhk, hfx, pzl, frs}
|
||||
2023/12/25 08:31:09 > step from rzs to lsr. visited : Set{hfx, rzs, pzl, frs, qnr, rhn, lsr, xhk}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from rzs to lsr
|
||||
2023/12/25 08:31:09 entering remove edge for rzs and lsr
|
||||
2023/12/25 08:31:09 before remove first [rzs : Set{lsr, qnr}]
|
||||
2023/12/25 08:31:09 removed first [rzs : Set{qnr}]
|
||||
2023/12/25 08:31:09 before remove second [rzs : Set{qnr}]
|
||||
2023/12/25 08:31:09 removed second [lsr : Set{lhk, frs, pzl}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from frs and neighbors Set{qnr, lhk, lsr}
|
||||
2023/12/25 08:31:09 initial dfs from frs to qnr with initial visited Set{frs}
|
||||
2023/12/25 08:31:09 > step from frs to qnr. visited : Set{frs}
|
||||
2023/12/25 08:31:09 > step from qnr to rzs. visited : Set{frs, qnr}
|
||||
2023/12/25 08:31:09 > step from rzs to qnr. visited : Set{frs, qnr, rzs}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from rzs to qnr
|
||||
2023/12/25 08:31:09 entering remove edge for rzs and qnr
|
||||
2023/12/25 08:31:09 before remove first [rzs : Set{qnr}]
|
||||
2023/12/25 08:31:09 removed first [rzs : Set{}]
|
||||
2023/12/25 08:31:09 before remove second [rzs : Set{}]
|
||||
2023/12/25 08:31:09 removed second [qnr : Set{frs}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from frs and neighbors Set{qnr, lhk, lsr}
|
||||
2023/12/25 08:31:09 initial dfs from frs to qnr with initial visited Set{frs}
|
||||
2023/12/25 08:31:09 > step from frs to qnr. visited : Set{frs}
|
||||
2023/12/25 08:31:09 > step from qnr to frs. visited : Set{frs, qnr}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from qnr to frs
|
||||
2023/12/25 08:31:09 entering remove edge for qnr and frs
|
||||
2023/12/25 08:31:09 before remove first [qnr : Set{frs}]
|
||||
2023/12/25 08:31:09 removed first [qnr : Set{}]
|
||||
2023/12/25 08:31:09 before remove second [qnr : Set{}]
|
||||
2023/12/25 08:31:09 removed second [frs : Set{lsr, lhk}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from frs and neighbors Set{lsr, lhk}
|
||||
2023/12/25 08:31:09 initial dfs from frs to lhk with initial visited Set{frs}
|
||||
2023/12/25 08:31:09 > step from frs to lhk. visited : Set{frs}
|
||||
2023/12/25 08:31:09 > step from lhk to nvd. visited : Set{frs, lhk}
|
||||
2023/12/25 08:31:09 > step from nvd to cmg. visited : Set{lhk, nvd, frs}
|
||||
2023/12/25 08:31:09 > step from cmg to nvd. visited : Set{nvd, cmg, frs, lhk}
|
||||
2023/12/25 08:31:09 <<<< cycle true, from cmg to nvd
|
||||
2023/12/25 08:31:09 entering remove edge for cmg and nvd
|
||||
2023/12/25 08:31:09 before remove first [cmg : Set{nvd}]
|
||||
2023/12/25 08:31:09 removed first [cmg : Set{}]
|
||||
2023/12/25 08:31:09 before remove second [cmg : Set{}]
|
||||
2023/12/25 08:31:09 removed second [nvd : Set{pzl, lhk}]
|
||||
2023/12/25 08:31:09 >>>> starting new find cycle
|
||||
2023/12/25 08:31:09 initial search from cmg and neighbors Set{}
|
||||
2023/12/25 08:31:09 <<<< cycle false, from to
graph_test.go:57: removed edges Set{{frs qnr}, {cmg lhk}, {bvb cmg}, {bvb rhn}, {lsr rzs}, {jqt nvd}, {nvd qnr}, {cmg nvd}, {jqt rhn}, {cmg rzs}, {frs rsh}, {rsh rzs}, {jqt ntq}, {cmg qnr}, {lsr rsh}, {jqt xhk}, {bvb hfx}, {pzl rsh}, {qnr rzs}}
graph_test.go:58: after removal graph is {Nodes:map[bvb:[bvb : Set{ntq, xhk}] cmg:[cmg : Set{}] frs:[frs : Set{lhk, lsr}] hfx:[hfx : Set{rhn, pzl, ntq, xhk}] jqt:[jqt : Set{}] lhk:[lhk : Set{nvd, lsr, frs}] lsr:[lsr : Set{frs, pzl, lhk}] ntq:[ntq : Set{bvb, xhk, hfx}] nvd:[nvd : Set{pzl, lhk}] pzl:[pzl : Set{lsr, hfx, nvd}] qnr:[qnr : Set{}] rhn:[rhn : Set{xhk, hfx}] rsh:[rsh : Set{}] rzs:[rzs : Set{}] xhk:[xhk : Set{ntq, hfx, rhn, bvb}]]}
--- PASS: TestRemoveAllCycles (0.00s)
PASS
ok  	sunshine.industries/aoc2023/day25	0.003s
#+end_src
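
The trace above boils down to one loop: pick a start node, walk the graph depth-first until a step lands on a node that is already in the visited set, then drop that closing edge from both adjacency sets. A minimal sketch of that loop, assuming a plain map-of-sets adjacency and invented names (=findCycleEdge=, =removeEdge=), not the actual day25 types:

#+begin_src go
package day25 // assumed package name, matching the test binary above

// findCycleEdge walks the graph depth-first from start and reports the first
// edge whose far end is already visited, mirroring the
// "<<<< cycle true, from X to Y" lines in the trace.
func findCycleEdge(adj map[string]map[string]bool, start string) (from, to string, ok bool) {
	visited := map[string]bool{}
	var dfs func(node string) (string, string, bool)
	dfs = func(node string) (string, string, bool) {
		visited[node] = true
		for next := range adj[node] {
			if visited[next] {
				return node, next, true
			}
			if f, t, found := dfs(next); found {
				return f, t, true
			}
		}
		return "", "", false
	}
	return dfs(start)
}

// removeEdge drops the edge from both adjacency sets, matching the
// "removed first" / "removed second" pairs in the log.
func removeEdge(adj map[string]map[string]bool, a, b string) {
	delete(adj[a], b)
	delete(adj[b], a)
}
#+end_src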
** kind of bruteforce
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 518391
graph_test.go:93: hopefully as answer: 0
graph_test.go:93: hopefully as answer: 518391
graph_test.go:93: hopefully as answer: 0
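
The graph_test.go:93 line appears to print the candidate answer for each attempt: 0 while the graph is still in one piece, and 518391 once the removed edges actually split it in two (the day 25 answer has the shape "product of the two group sizes"). A rough sketch of that check, again with guessed names and the same plain map-of-sets graph:

#+begin_src go
package day25 // assumed package name

// componentProduct sizes the connected components left after the candidate
// edges have been removed. If the graph falls into exactly two pieces it
// returns the product of their sizes; otherwise it returns 0, like the
// failed attempts in the log above.
func componentProduct(adj map[string]map[string]bool) int {
	seen := map[string]bool{}
	sizes := []int{}
	for node := range adj {
		if seen[node] {
			continue
		}
		// Breadth-first walk over one component.
		size := 0
		queue := []string{node}
		seen[node] = true
		for len(queue) > 0 {
			cur := queue[0]
			queue = queue[1:]
			size++
			for next := range adj[cur] {
				if !seen[next] {
					seen[next] = true
					queue = append(queue, next)
				}
			}
		}
		sizes = append(sizes, size)
	}
	if len(sizes) != 2 {
		return 0
	}
	return sizes[0] * sizes[1]
}
#+end_src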
140
day3/input
140
day3/input
@@ -1,140 +0,0 @@
..........................380.......................143............................108.............630...........425........................
|
||||
....*585..30....217*616..........$...................../....$.................447...........381..................+..........973.............
|
||||
.210......*...............639...541..-........830*...........912..........743*.......................828..671........+......*...............
|
||||
.......760....$..............*........737.*.......949..568.......................=........628.85........&.#..........87...535.....794.......
|
||||
....#......616..........373.999..392......853..........&.........666.......*.....365.............807............@....................*......
|
||||
.680................800*..............684................329....*.......960.186........725........*......&.....631.....700*818..............
|
||||
............-402...........%.........@.........576.............956................../.....*....237....490........................998........
|
||||
........624.........600/...283..906................301....903...........=495.....917..165..193......................................+.......
|
||||
....977.+........................*........*610.....-....................................*.......489*795.......-....@545..915......*......641
|
||||
...-.......123........@........113.....643............117.483../...................961.984..................878...........*........277..@...
|
||||
.............$....787..80....................490......$....../..802.....591...373...*..............228..................848..840............
|
||||
.613.......................810..........740......476.....902............$..........201......%.......$..-.......993......................701.
|
||||
......268.......429.........$..........+....582..........@...538*789......................941...136.....334......*.........508...250........
|
||||
........*...745*.........+......*77.........*......653..................#....27................/......+..........491....#....*...*....192...
|
||||
.......978.............26....957......*...400.........*834.#85.......429............................291................690...69.814....#....
|
||||
...276.....=.571....................65....................................765.......179..720..................460.302.......................
|
||||
...*.....577...*..........................=50..758............/.52@........*..........@..............279........-...........................
|
||||
..821...........676..839...227..................*..........273...........789...42.288...#.......226......................451..592......943..
|
||||
.........*...................*......................................$................*..378.......*..876.497....@..575*.........*.....*.....
|
||||
......524........353........664......781...../108..................124.............407........272.43.#....&..421.......708......../..405....
|
||||
................*....769.............*.............../.........432..........57$..............*...................=...........887.637........
|
||||
................96...*................700..%.......573........*....644..........802......798.897.......456..258.130..182......*.............
|
||||
...........&.........420............#.....369..............121........$....916...*........*...............*.............*637.......*816.....
|
||||
...911...97....990...................557.......................355............#.578........764.872.........80.....*792..........517.........
|
||||
.....*..........*...-609....675.............657......728......%....446....963.........201......%.......232.....576.......116........630.....
|
||||
..801..........352............%..............*..........*422..........*..*........538..*.........*435.*.....................*730.....*......
|
||||
....................&................952..283.....931...............95...617........%..215....174.........139........375...........755.426..
|
||||
..........972.......82...500........*.............*...854.......208...........853%..................=......*...........*..941...........%...
|
||||
....975+.....*.....................58....$..836.899..../.........*...................@682........952........193.......13.*........711.......
|
||||
............757..............45.......453...../.............603.735......155..848..........402..........954..............530...........%....
|
||||
..........................................285...........596...=.............*...............*.......*....#......*681...........*162.517.....
|
||||
.173.589.......892..451......268...........*............../............764...476...........206...860.123.....812.....97*319.139.............
|
||||
.......*............%....981....*288....917....666..............#.........&.......532@...-...............330.........................731....
|
||||
......764.......*........*...................=...+.985...952...498..+473.................98..&..663..414....*...885&...788..../.............
|
||||
.............508.439.....956......799.122..786.............*.............214.................63............346........=......507............
|
||||
...252*780.......................*....*.....................446............*.....900.540.................................510...........899..
|
||||
...........169-......%..@........328.506................573.................367.....*..........646.896...815...691..............%141........
|
||||
.................-.195.606.....&................908......./...67*49...@.........707.....#607......*........*.....=..........................
|
||||
..902..@......742.............706..355&.....953...*..811...............125..490....*241..............+..468..............512....334.........
|
||||
...*....463.......802.260&....................*..711.-...-......................%.................360................811.%.............%....
|
||||
.113............%..*....................%861.999.......458..................334.924..425..406..........184#................712....#.....176.
|
||||
.........166...689.124......*550.840.......................742................#........%......405...........26.........546*.......665.......
|
||||
............*............159........*...........@879.......*.........48..................987......751.319....*...113........................
|
||||
..........163............................................453...524.....$..........*936...*...........*......119..+.......756.......&...326..
|
||||
777*241.......210.974.258.....528..............................$...............559.......921......@....382...............*........147...*...
|
||||
....................*.*....................131*156........./................@......445.........477......*.....47=....875..730..........499..
|
||||
.....682..........545..856............................165..740...831.........368..........851......926...223............*...................
|
||||
....*.............................447..................@...........*..............%...418....*.....*...........838...325....................
|
||||
....722.......71....848...747........*..690....275..........740...16..123..129...837.....+.359.68..426......=....-........791...617..582....
|
||||
.............*............=...271...358.*.........*...................+.......*................*............378............$................
|
||||
570.......620..399*666...................702.....32...............215..........896.325......377...................=286.................200..
|
||||
...*833...................................................12/.247......510.........*...369........813*..........*...........................
|
||||
....................................%..........-......&..........*655....*.......380....*.............778.....98.910.......752..............
|
||||
.............643.783..............621....777..523.....384..$...........693...............336................................#..+............
|
||||
521.........*.......*.................$...................261.31..............549.................102......467-.......219......286....$118..
|
||||
..........837........904..636.298.204..368.......47............/.................*...........246...*....#.......526......*..................
|
||||
.....201......497............*....*..............*................982........+..243.............*.774....488.......*..641...........140.....
|
||||
838...*....+.......................296....*765..489....8....................796.......=.......200................996.........364.......*....
|
||||
.......540.93....+...............................................*600.................47..................256........#..........*...+..823..
|
||||
................14...-65..................328.............................407+..............54.464..990....*...305..129.......679..141......
|
||||
.........................487.-.....831.......*.....407.776.409.....75.............426..782...*...*..#......347....*....................864..
|
||||
....995...................=...117.../..*..531.........*......$.......*973...........*...@......614................194........172...930*.....
|
||||
.......*386...........................799.....660...................................415....../.....%370...289............988................
|
||||
.....-..........835+..316.47..............226......904.........789..........................32............$....616..........=...............
|
||||
...243.....26............*........479%....=....@...=..............*......#.....292..............389.....@.....*...................+...$546..
|
||||
..........*.....&.......................$....199......@........&..25......345.....*36...626.....*.....632..108..69*362...&561..517..........
|
||||
.136*291...858...815........974.......913............786....908......495......805..........*122.948.........................................
|
||||
...........................*...............103*363...............438...........*..@.@...........................%401.......90=......403.....
|
||||
.313&.618....@...$521.....332............................768.......*.........428.50..445.338.....*.......................%.....115.*........
|
||||
.......*.....583..............586...........=..............*.......11.......................=.171.280.........140..750...798...&....286..111
|
||||
.....604..........266......................566....-.....398..238.........499........#...@......................*.....%......................
|
||||
.........545*739.$............68..337*.........296..............*................694...100......&..............233..........................
|
||||
.....320................468....-........997......................796...48.....................219.....&............378.552...8..............
|
||||
.......$..........356+....*................*.........205................/........645*....600.........709.....................*...760........
|
||||
...625..................681........870-...907.......-....142..173...859..............802...*.....................741..289*74.............705
|
||||
.........244......908....................................*.../........*..................501......92.............*..........................
|
||||
............*......$.....576.384..111.119.............127..............720....898....534............*....347........................487*....
|
||||
....610...922...........$.....*...*..........381*36.........................../.......*......665.843.......*....754.470..690................
|
||||
.....*.........$..135.......142.242.352..............................775...-.....*....67....*.........+...576......*.....#..................
|
||||
.....249....735......*.......................................402-...*....903..667.419......526........294...................................
|
||||
.361.....-......418.659.575......803.596......910....367...........950.....................................308........*.....................
|
||||
...*......690..*...........*797.................*.......*....................160..........61.....#........*..........141.....402...599..811.
|
||||
...505..............288%........525%...........355...324.................876*...............*538..39....482..696............@.........*.....
|
||||
...............401........@............884..........................&.......................................*.........749.............86....
|
||||
.........484......*38..370........328.*............553......&845.....662................520................46................24.............
|
||||
.....407*..........................$..310..........*.....................765..672............832.......................@.657....555../..588.
|
||||
..........107.789..200..213.....................898..545........67...........%................./...........101.742...422..%...=...&.622.....
|
||||
.............*......*............*292......946......&..............243+.651....178...665............./.........*.............92.............
|
||||
......974...........814.......220......906......$......*.....879...........*......+.....*........-...995........185.18...........254..198...
|
||||
...........791..808.....984...............&...985...871.678..*.............340................973..........253.......#..................*...
|
||||
..../.....*........=....*..........92*411.....................835...................%.................#.../................$.........279....
|
||||
....926....292...........839...333..........600.115....................=....&.....336.....350........966......999...540..908.............705
|
||||
................................*...497...........*....994..........447......131.........*...................*..............................
|
||||
...........$.....-....../.....394...=.......175.417....*................................719...................992....................-......
|
||||
..........960.698.....610.986.........*776..........516...34......&.....75...751...$........%.......651.....=......300...........364.905....
|
||||
...317.......................*..+...........................*....783.....*..=......655.......310.....+...339..........*......256*........145
|
||||
....&..107..................276.136..905......548..%.......334.........471...................................279.727..696...................
|
||||
343...*......733........515..........$....357....@.301.........*348...................118.........&...439...*.......................922.....
|
||||
....754.....-......646.@.................-..................929.................475......*......785......*..684..............27......*......
|
||||
...................*.....463..../59.609...........168*863...................801...&..211..382..........347............28.............473....
|
||||
.................725.....#...........*......79..........................978*...........*......................199.....*.........579.........
|
||||
...........................157*662....958...*...............546@.................&..181...416.109..............*....258.....421*............
|
||||
...859......777=.............................217.862..................817*597...655........*.........76.....443.....................292.....
|
||||
...@...170..........273...........+..=.............................=................217....253..438....*....................................
|
||||
.......&.......................971..876...............205.....394.502..122...........$.............$...40............802....................
|
||||
..894........*.623...718..................216........*.........=.......................544.....................102............932.951..615..
|
||||
....+......865........*...............................298.721................147.......*.........165.741*794...*......318...&.......*...*...
|
||||
...................878..........................*.........../............947...........314.......*...........999............493.....996.330.
|
||||
...........705.........79........994.645.703.203.258...........706...227*..........+............590....822*.................................
|
||||
.....468..*......*311./...@23...$.....@....................588...*.........317......113....857.............19.#..&260...571...251=....772...
|
||||
......*..11...175..........................................*....259...509.....*...........*.......443..487....73........................*...
|
||||
.....161..............................................832...611..........*609..742.........938......*.*............@909.....66.232..........
|
||||
..........125...............630.....596.......%477.......*........................................601..........158............*......201*864
|
||||
...................819.......=..-.........................198..............=..........418...60............904...............................
|
||||
781...663..250...................281....905.+...187....%..............271..852...........=..*...484...517*..................................
|
||||
.......*......=....865.826...%............&.679..*.....160..576.....................462+...717.............&........389........783....67....
|
||||
.......988............*......79..441............474.........#.....471.......298................./.....*511.445......*......463*.............
|
||||
...........91....807.............*......173.............#27.................*......937.353.....85..510..............109.#............70.....
|
||||
...........*.......*...........955..398...@......75.........*...........32...221../....*....7..........234../481........841.........@.......
|
||||
........#..593......864.....................932.*........960........160*.............953........385...+.....................................
|
||||
.702*..596.........................$....952*.....20.........................................257....*........741.858.......*......839..491...
|
||||
................#........618.....693...............................333*87......173......268*......775..................699.710...*..........
|
||||
..........985..586...818..*..........*723.......437*..........148...................333........................156.............821.....651..
|
||||
.....................#...982......360.......757.....706.122....&............245........-........16.......571.......544................$.....
|
||||
6...381......................657..............#...................112*.....*.....496......388.&....556#.....*428..*................-.....828
|
||||
........#.398.688.900@..674....*....284.860..........-....................588.............*...47.........@........280............86.........
|
||||
.....526.....*.............*...........*.....154......267..960......563.......29.......398........309#....849./........240*685..............
|
||||
.......................692.420.978#.........*.....261.....%........+.........*.....492......$957..............979..................254.691..
|
||||
...978#...131*980.......+.................925.....*...627......519....368*..324...*..................$..................52............*.....
|
||||
....................453........400$............................*................745...+..........-..838..#....118.........=.....815.........
|
||||
....679......796.........920........$...75...290..195.......307...340.....649..........403......165.....26.......*..........711.*...........
|
||||
....*.........%..100.440.........206....$...*....*...................*254..........990..............249..........380....-.....%.535.........
|
||||
..........988.....*...*...................399..723.....*....................107......*.....................49...........422.................
|
||||
...800*......*..183..117..375......375..............807.691........981................890.......622...97.....$.515................@.........
|
||||
.......413.129...........*.............729....&137...........956....$.....976....................*....../..........................334......
|
||||
...313..........794-..807.....*698.....&...........874*.........*...............71........................655...............................
|
||||
.....*.....................193............*547.........459.....338...489..581.......483..@115..125........*..........220....................
|
||||
....907..361*.....243..834.............987.....149..........@.........*...............*........$.......509..698.142../.....@.=750......4....
|
||||
....................-...*.....................*............611........21..251.&......578........................*.......717.......47#.......
|
||||
.........................610.26.............892...............................299............601............721..729........................
204
day4/input
204
day4/input
@@ -1,204 +0,0 @@
Card 1: 5 27 94 20 50 7 98 41 67 34 | 34 9 20 90 7 77 44 71 27 12 98 1 79 96 24 51 25 84 67 41 5 13 78 31 26
|
||||
Card 2: 52 14 37 45 82 39 73 67 72 90 | 72 78 37 25 39 68 23 45 73 90 86 2 85 57 80 62 22 26 92 67 82 95 66 14 52
|
||||
Card 3: 33 86 80 53 45 32 25 29 84 89 | 89 53 74 49 73 30 25 82 84 44 59 86 32 54 96 41 26 80 68 29 45 33 23 99 17
|
||||
Card 4: 64 25 5 1 46 75 45 55 21 7 | 93 62 21 60 46 44 96 88 12 63 85 91 14 55 68 67 16 74 45 41 75 70 25 36 78
|
||||
Card 5: 10 72 81 60 97 56 90 25 76 82 | 19 8 57 3 86 72 84 76 98 34 56 10 25 70 90 40 65 44 60 81 54 93 97 17 82
|
||||
Card 6: 94 58 92 25 7 19 52 82 85 64 | 52 22 82 25 3 92 83 86 35 7 95 58 93 19 48 64 57 43 28 45 94 8 63 79 85
|
||||
Card 7: 3 82 59 79 99 28 96 75 68 74 | 27 49 68 75 89 88 61 74 47 62 82 59 46 26 65 79 96 28 80 29 3 33 35 31 99
|
||||
Card 8: 87 46 34 69 59 44 35 93 11 25 | 72 34 43 86 19 30 9 55 64 81 37 29 7 1 63 68 90 40 45 89 36 98 99 96 11
|
||||
Card 9: 42 99 98 17 88 78 27 10 71 1 | 81 63 4 15 36 48 71 54 78 95 99 1 27 37 10 39 2 62 35 66 33 77 16 29 26
|
||||
Card 10: 60 55 21 8 12 78 11 25 76 56 | 67 4 21 26 55 7 16 6 17 41 44 76 61 15 8 80 88 42 89 57 53 59 13 10 63
|
||||
Card 11: 93 41 81 32 89 5 67 87 28 54 | 69 75 26 86 77 39 47 71 85 57 31 40 22 33 45 49 1 60 25 56 83 11 52 82 59
|
||||
Card 12: 28 85 16 9 21 12 62 20 90 98 | 28 10 36 80 33 12 35 15 90 48 50 49 5 14 11 19 59 99 94 52 84 21 85 20 69
|
||||
Card 13: 21 60 32 18 44 62 63 2 38 78 | 99 95 9 70 46 25 55 86 3 67 89 12 58 81 29 80 75 17 38 49 69 77 51 60 74
|
||||
Card 14: 58 41 11 80 75 40 3 71 35 59 | 46 1 30 56 81 59 85 45 66 31 64 58 65 75 80 62 48 67 4 52 68 24 28 33 5
|
||||
Card 15: 92 32 53 69 42 39 14 86 60 1 | 60 75 45 84 66 72 64 27 5 59 99 4 91 30 28 62 31 44 11 46 23 57 43 63 71
|
||||
Card 16: 72 9 84 64 5 39 19 52 98 22 | 53 68 16 54 87 99 4 96 93 88 10 3 37 74 20 49 77 63 95 11 69 56 36 75 23
|
||||
Card 17: 50 37 64 41 39 4 58 75 24 40 | 16 84 48 69 8 78 33 2 79 19 47 43 44 67 56 89 91 86 20 10 21 23 59 14 6
|
||||
Card 18: 52 92 27 42 83 17 53 84 15 70 | 2 19 64 75 65 80 71 74 66 90 78 24 94 28 48 46 21 32 51 35 95 82 37 81 67
|
||||
Card 19: 97 10 20 67 77 79 55 58 34 12 | 27 47 89 25 75 46 58 11 48 41 42 45 37 18 88 6 97 95 66 32 12 67 34 64 53
|
||||
Card 20: 2 52 6 56 53 23 16 82 17 41 | 8 81 31 23 36 28 80 9 72 42 86 27 49 24 56 53 41 75 2 16 55 17 35 13 63
|
||||
Card 21: 79 38 97 58 91 51 33 15 47 37 | 7 22 39 71 75 60 30 19 11 5 20 81 42 13 63 49 70 17 61 96 43 80 98 40 41
|
||||
Card 22: 81 56 17 83 59 96 40 4 39 78 | 32 52 43 87 85 36 33 81 55 69 31 57 67 3 8 1 38 40 35 79 22 75 26 51 53
|
||||
Card 23: 2 99 96 90 58 27 83 97 75 23 | 90 79 75 77 72 13 7 23 74 59 97 88 29 22 67 99 51 58 27 96 38 2 60 83 11
|
||||
Card 24: 15 56 61 39 34 31 51 58 14 90 | 83 50 57 90 68 42 58 34 38 75 85 15 51 56 73 72 39 84 65 31 46 60 47 16 92
|
||||
Card 25: 13 56 14 57 55 76 24 11 87 96 | 89 57 75 43 12 52 14 5 78 79 27 40 87 76 86 83 22 24 72 96 88 11 45 7 55
|
||||
Card 26: 46 68 86 42 12 59 53 69 51 39 | 5 79 53 80 96 24 43 3 94 10 1 48 89 18 91 50 36 69 7 12 65 86 25 98 56
|
||||
Card 27: 49 99 63 73 83 29 51 16 75 53 | 66 1 44 31 56 49 50 97 32 99 16 59 94 53 71 23 45 47 98 65 78 80 73 34 95
|
||||
Card 28: 37 6 63 95 46 99 47 71 10 21 | 47 29 84 43 2 38 48 90 20 14 70 8 31 37 3 94 49 73 18 54 25 36 33 59 80
|
||||
Card 29: 45 2 93 70 3 94 76 88 50 9 | 94 26 47 97 58 85 79 18 4 7 12 31 90 23 35 6 17 24 20 15 86 10 37 46 9
|
||||
Card 30: 73 20 65 67 75 78 43 82 55 21 | 8 71 70 66 32 49 19 47 83 16 91 20 57 95 4 2 61 84 68 17 31 52 6 60 43
|
||||
Card 31: 98 67 59 51 90 42 3 83 78 17 | 64 93 26 18 23 31 52 66 15 63 10 2 37 22 88 41 40 54 57 60 92 53 75 67 71
|
||||
Card 32: 40 23 35 69 73 77 32 74 43 42 | 3 46 16 59 95 80 21 26 36 15 71 14 91 11 85 75 81 17 88 60 65 66 18 98 19
|
||||
Card 33: 57 34 20 23 11 59 82 47 26 42 | 2 6 50 1 15 28 27 37 25 13 7 31 52 46 53 65 45 16 61 86 69 94 99 71 62
|
||||
Card 34: 95 66 98 53 25 88 84 61 68 17 | 17 25 68 60 21 43 88 86 87 53 12 16 95 46 66 98 72 4 23 71 61 5 28 90 84
|
||||
Card 35: 55 48 52 30 8 95 57 71 10 37 | 40 81 50 37 70 8 72 96 52 77 19 57 55 67 95 26 68 71 2 1 30 44 10 62 14
|
||||
Card 36: 59 91 77 81 22 21 58 88 33 95 | 62 91 78 98 57 59 6 77 20 24 11 63 81 79 22 85 21 26 33 88 99 72 95 58 93
|
||||
Card 37: 70 93 83 97 84 15 34 9 59 3 | 83 60 93 51 67 22 70 97 66 71 64 76 84 55 34 33 7 36 9 37 79 15 59 56 3
|
||||
Card 38: 86 52 85 38 35 22 55 70 13 27 | 97 52 44 38 39 17 85 70 82 36 40 13 28 9 62 56 73 86 29 11 27 80 48 81 92
|
||||
Card 39: 22 58 91 20 78 3 28 54 70 35 | 72 24 39 48 91 19 33 14 10 42 3 28 11 22 58 50 49 78 40 45 83 55 20 7 98
|
||||
Card 40: 22 95 92 62 26 43 63 48 50 53 | 96 50 89 37 23 54 40 76 78 60 49 7 15 18 98 36 26 11 35 93 43 66 75 94 84
|
||||
Card 41: 71 9 44 13 17 55 66 81 6 64 | 25 1 32 44 17 85 9 18 81 71 20 21 15 78 66 75 93 6 59 64 13 55 14 8 91
|
||||
Card 42: 88 46 40 3 83 93 30 8 48 5 | 95 37 5 83 3 24 8 62 38 99 26 30 40 46 10 18 93 15 91 19 88 36 16 48 53
|
||||
Card 43: 52 38 9 87 80 99 75 37 92 50 | 37 72 74 84 39 20 60 57 98 22 95 77 90 83 6 43 58 87 33 80 52 78 3 56 9
|
||||
Card 44: 10 23 22 43 73 91 15 32 27 70 | 20 38 60 59 43 8 23 76 70 45 91 68 32 25 11 72 92 53 33 77 31 10 99 27 64
|
||||
Card 45: 47 26 18 28 43 88 6 62 86 31 | 53 31 17 22 27 59 95 73 26 3 76 61 32 40 55 67 46 9 75 94 6 43 57 28 42
|
||||
Card 46: 24 36 64 28 92 16 1 11 47 81 | 57 82 92 76 4 52 53 20 85 36 29 26 3 84 1 32 35 73 33 2 83 27 50 14 16
|
||||
Card 47: 11 16 9 67 43 84 37 93 50 26 | 19 98 47 75 55 99 32 88 93 94 2 16 73 57 17 89 1 46 71 78 7 61 27 84 8
|
||||
Card 48: 45 19 66 95 5 51 80 60 89 73 | 3 32 13 48 52 73 18 75 68 94 28 59 20 26 46 93 10 27 24 63 61 55 39 43 78
|
||||
Card 49: 89 24 49 90 47 69 75 1 61 23 | 67 18 11 39 55 9 76 13 72 88 69 50 15 4 87 26 73 70 54 43 5 64 84 58 7
|
||||
Card 50: 53 8 14 57 67 1 9 61 13 99 | 54 59 75 20 35 45 15 56 60 51 85 7 25 48 73 11 32 23 41 69 92 14 70 67 44
|
||||
Card 51: 42 58 18 68 12 67 63 32 62 88 | 25 21 19 36 86 57 23 80 79 54 98 10 15 99 93 40 47 51 6 52 94 7 89 26 1
|
||||
Card 52: 64 57 7 52 32 68 70 73 35 8 | 86 79 31 55 85 9 12 97 18 93 6 76 72 63 58 2 41 65 50 47 34 25 81 62 74
|
||||
Card 53: 35 34 51 91 45 49 56 69 94 99 | 1 11 60 49 34 51 30 77 25 35 69 56 43 55 38 99 88 15 12 91 45 95 90 73 94
|
||||
Card 54: 16 1 7 36 13 88 22 5 9 55 | 9 33 39 71 1 10 13 22 74 56 7 30 36 41 21 8 47 5 46 66 79 32 63 55 48
|
||||
Card 55: 38 7 21 53 19 55 2 33 11 36 | 53 44 87 55 71 7 28 73 2 94 42 36 33 16 22 18 79 19 85 12 38 40 89 11 21
|
||||
Card 56: 12 9 60 83 87 34 5 42 86 91 | 68 87 80 39 57 75 19 95 61 51 76 21 74 24 12 83 93 20 97 42 52 69 91 5 34
|
||||
Card 57: 89 63 85 12 96 57 95 60 73 90 | 24 49 65 87 91 12 95 9 50 37 84 67 36 62 47 31 41 54 45 26 64 52 33 79 56
|
||||
Card 58: 69 50 65 36 54 46 60 66 79 53 | 65 79 54 50 92 40 31 73 1 48 46 53 60 67 72 77 63 36 9 69 97 66 39 99 22
|
||||
Card 59: 44 20 70 14 54 12 53 33 40 80 | 44 33 91 17 40 20 14 86 31 70 36 80 83 76 53 12 78 25 54 45 61 52 29 77 81
|
||||
Card 60: 41 71 90 97 53 91 68 52 65 14 | 35 80 83 15 97 41 91 22 54 70 75 14 48 37 90 69 39 68 60 53 65 21 49 42 76
|
||||
Card 61: 74 42 54 38 19 30 73 18 6 67 | 42 71 11 23 57 19 38 17 8 55 20 44 31 76 25 72 13 86 50 21 15 88 54 79 32
|
||||
Card 62: 2 58 94 86 48 38 18 29 59 76 | 28 66 52 64 86 71 34 3 13 53 22 14 69 11 72 54 27 42 24 91 75 62 68 51 57
|
||||
Card 63: 14 47 6 31 72 22 96 86 32 55 | 47 94 45 15 92 50 86 32 6 31 9 19 14 13 76 8 22 72 99 55 96 27 73 82 18
|
||||
Card 64: 83 63 36 68 81 58 95 65 14 2 | 55 72 42 93 20 84 99 62 90 18 48 76 38 96 91 59 88 37 98 23 7 46 60 19 82
|
||||
Card 65: 88 67 40 18 65 30 6 79 57 8 | 17 59 29 66 21 48 57 18 67 23 79 16 90 82 8 30 10 92 86 42 88 53 65 98 60
|
||||
Card 66: 9 51 73 19 96 80 75 87 91 47 | 57 11 82 30 78 8 20 58 88 98 5 87 61 28 95 15 26 73 27 9 14 12 75 76 68
|
||||
Card 67: 27 14 23 65 97 8 61 22 34 38 | 87 40 96 93 64 33 18 41 95 98 47 39 57 20 13 50 55 22 74 85 89 21 28 58 84
|
||||
Card 68: 21 53 58 92 95 29 47 33 77 22 | 6 49 37 51 58 7 94 47 77 13 53 11 17 25 50 33 15 81 56 30 1 16 24 19 85
|
||||
Card 69: 17 53 19 13 35 15 26 54 63 12 | 99 41 34 32 4 87 81 98 15 44 64 69 61 23 58 17 79 80 59 9 57 86 72 54 56
|
||||
Card 70: 8 94 37 56 90 43 96 7 67 76 | 90 30 42 98 3 59 64 92 93 58 52 86 23 49 37 5 34 31 95 6 7 4 74 43 2
|
||||
Card 71: 98 61 90 28 93 85 99 53 66 62 | 57 74 77 69 70 23 75 24 40 11 13 82 35 36 17 14 15 8 50 32 41 62 97 26 58
|
||||
Card 72: 33 83 94 80 69 74 81 65 41 82 | 99 88 35 93 8 9 53 47 14 63 75 60 61 38 36 94 12 15 27 16 77 87 68 56 10
|
||||
Card 73: 98 99 97 64 14 53 3 41 22 50 | 39 32 57 17 77 13 52 90 51 66 4 23 59 79 22 5 67 20 10 86 80 26 93 55 16
|
||||
Card 74: 79 33 10 24 41 95 45 13 86 7 | 29 69 89 16 12 76 36 83 54 34 73 40 57 97 88 66 31 53 50 47 37 84 3 96 81
|
||||
Card 75: 12 2 84 59 72 50 80 22 44 81 | 2 84 30 11 70 54 48 58 85 26 94 16 90 22 65 60 72 40 93 66 14 32 80 88 19
|
||||
Card 76: 15 75 71 49 24 4 13 51 82 89 | 71 74 89 84 68 24 87 41 66 75 13 51 11 39 49 38 44 15 4 67 56 59 82 83 92
|
||||
Card 77: 48 94 85 78 23 64 3 46 53 19 | 69 19 78 3 36 15 74 91 68 4 33 16 96 38 46 53 85 94 23 44 64 42 99 48 6
|
||||
Card 78: 40 92 79 69 9 90 27 6 55 63 | 45 85 36 5 93 70 22 47 57 12 88 4 89 15 26 30 29 77 76 10 84 54 48 17 1
|
||||
Card 79: 82 91 80 99 18 95 59 69 19 78 | 78 15 69 24 23 18 19 97 55 98 53 82 59 99 95 4 92 80 2 43 70 47 93 91 31
|
||||
Card 80: 18 69 14 33 62 65 57 94 83 70 | 28 48 22 97 60 27 3 85 68 52 37 76 46 67 17 65 18 98 89 94 33 78 70 92 11
|
||||
Card 81: 56 21 33 99 86 77 28 80 53 35 | 86 79 52 94 49 74 92 1 35 8 96 91 53 28 78 80 22 85 41 3 98 99 65 2 25
|
||||
Card 82: 83 54 52 46 35 59 77 2 11 78 | 30 98 21 28 77 89 52 59 31 1 75 83 79 33 61 6 8 78 54 2 14 11 35 46 16
|
||||
Card 83: 82 98 3 83 39 46 61 68 91 5 | 79 22 3 34 18 86 98 5 82 93 81 57 67 12 60 83 51 46 27 62 91 33 90 65 63
|
||||
Card 84: 19 2 23 89 53 98 3 48 77 91 | 9 77 57 61 55 2 84 3 14 60 23 48 98 89 64 53 43 91 16 19 7 46 35 44 66
|
||||
Card 85: 74 70 80 22 78 84 48 57 67 75 | 50 42 3 60 80 19 99 70 39 6 74 57 36 25 54 48 29 9 84 37 12 30 44 46 17
|
||||
Card 86: 92 49 27 86 14 67 25 85 10 87 | 86 72 57 68 71 47 17 90 22 29 12 36 20 66 84 91 76 96 44 48 85 34 24 56 39
|
||||
Card 87: 41 21 80 47 77 64 55 13 63 99 | 40 72 77 62 54 13 23 98 73 33 2 3 42 51 65 41 99 80 88 74 55 84 79 21 63
|
||||
Card 88: 87 40 46 5 49 88 60 13 38 29 | 38 37 68 40 4 58 55 87 66 60 20 26 92 10 2 13 90 7 59 29 17 27 49 54 46
|
||||
Card 89: 9 84 31 42 33 25 86 38 37 94 | 37 78 21 31 4 91 42 71 23 30 73 55 75 7 15 5 35 81 19 33 94 63 92 25 80
|
||||
Card 90: 39 38 79 40 96 57 56 90 97 48 | 43 1 61 30 28 80 64 26 50 19 77 23 4 78 33 52 5 58 60 31 95 72 35 91 70
|
||||
Card 91: 7 52 99 74 30 59 68 48 80 14 | 20 2 83 64 1 19 66 15 16 24 50 93 53 88 6 46 13 56 32 82 97 51 67 10 33
|
||||
Card 92: 19 23 1 88 80 22 26 31 76 74 | 60 12 98 28 95 30 38 22 61 16 31 76 21 62 46 15 82 25 41 44 63 58 89 57 37
|
||||
Card 93: 88 94 16 71 8 24 81 77 90 43 | 36 53 17 65 29 34 22 91 43 76 54 64 95 82 73 8 12 9 1 44 93 80 14 55 72
|
||||
Card 94: 92 59 16 51 77 99 41 13 64 72 | 78 52 57 45 75 62 86 90 18 40 58 6 11 51 81 20 38 66 88 68 7 53 34 76 56
|
||||
Card 95: 92 1 56 16 94 38 51 71 53 46 | 85 93 64 44 68 59 20 98 62 67 77 99 23 48 91 10 40 87 13 55 7 34 58 45 15
|
||||
Card 96: 8 2 71 53 58 42 17 56 40 28 | 88 41 68 33 64 32 48 55 62 20 95 76 72 94 11 96 31 63 59 3 79 82 54 69 43
|
||||
Card 97: 47 21 49 98 40 26 92 32 28 85 | 28 65 52 98 45 42 10 18 44 82 38 60 66 21 49 22 85 40 16 32 47 20 27 51 2
|
||||
Card 98: 91 76 77 26 7 47 53 1 82 41 | 26 33 44 37 82 41 75 64 79 95 76 17 86 52 30 97 94 16 61 78 66 42 46 63 34
|
||||
Card 99: 10 72 12 27 38 17 50 57 25 29 | 92 31 53 48 50 57 25 81 39 12 15 2 19 55 43 32 17 29 72 65 27 33 77 38 75
|
||||
Card 100: 43 65 55 94 99 11 93 6 91 88 | 32 55 53 22 83 94 12 96 7 25 48 57 75 93 38 11 23 43 92 91 60 88 4 99 78
|
||||
Card 101: 91 30 63 15 33 87 61 50 64 19 | 52 76 9 44 6 75 53 79 69 11 37 46 74 65 80 47 66 10 13 2 72 3 20 54 32
|
||||
Card 102: 33 71 42 22 29 58 65 67 26 87 | 94 18 67 39 66 46 52 80 22 29 50 86 42 74 25 95 23 87 26 8 53 7 16 33 71
|
||||
Card 103: 42 18 40 16 12 91 28 47 56 34 | 59 66 26 34 57 56 91 52 12 13 7 21 16 83 42 72 18 64 97 41 28 40 47 35 45
|
||||
Card 104: 35 23 73 88 45 16 38 84 60 66 | 2 84 35 1 91 45 67 78 9 73 31 8 95 51 55 38 27 16 88 66 23 17 58 82 60
|
||||
Card 105: 97 13 6 16 98 22 7 15 21 33 | 97 70 96 13 12 18 25 16 30 98 6 22 7 21 17 71 26 78 4 1 82 33 85 15 10
|
||||
Card 106: 99 97 31 59 9 81 80 14 53 35 | 71 35 97 17 88 99 59 24 82 81 21 14 6 33 57 46 69 9 80 31 66 62 77 43 53
|
||||
Card 107: 55 18 42 53 47 62 50 80 15 20 | 92 69 48 24 11 73 62 12 70 20 7 10 71 59 38 75 63 80 4 65 42 67 84 91 66
|
||||
Card 108: 79 3 99 80 45 2 50 95 72 22 | 33 73 6 98 13 34 41 71 47 32 11 27 29 30 63 92 57 51 74 24 2 12 31 59 62
|
||||
Card 109: 26 99 85 49 34 10 63 54 92 19 | 74 31 54 61 33 22 75 37 82 35 34 94 19 98 29 70 57 5 99 96 64 92 42 53 43
|
||||
Card 110: 15 18 92 59 99 34 19 78 20 45 | 44 59 98 12 61 41 90 99 9 92 57 88 15 81 51 95 2 62 30 19 96 7 43 80 45
|
||||
Card 111: 17 88 22 95 77 64 2 21 42 43 | 13 34 64 27 57 76 92 24 43 75 60 74 77 26 80 89 88 14 2 32 42 91 37 79 82
|
||||
Card 112: 50 96 38 27 48 69 29 67 62 6 | 1 15 62 89 67 93 48 44 64 14 9 19 92 12 29 51 20 33 31 26 74 11 27 49 87
|
||||
Card 113: 40 62 52 45 88 24 57 1 12 76 | 69 81 43 41 78 54 10 91 80 87 15 6 8 38 37 34 95 11 76 18 27 70 29 26 75
|
||||
Card 114: 95 70 93 78 28 30 46 50 53 71 | 25 91 10 41 16 33 68 85 82 76 83 21 94 74 26 13 29 47 4 92 5 56 67 55 20
|
||||
Card 115: 46 50 16 33 72 34 51 27 6 97 | 57 91 15 78 84 89 59 41 79 55 14 94 62 28 32 76 25 31 72 56 37 63 64 83 69
|
||||
Card 116: 26 57 31 54 53 22 27 66 34 91 | 93 72 75 36 18 79 46 56 11 23 51 65 35 84 60 28 6 3 25 81 89 58 10 85 30
|
||||
Card 117: 34 60 67 13 62 90 76 41 25 89 | 5 19 46 58 73 61 28 84 98 95 39 15 85 27 47 42 63 87 69 92 31 17 91 72 70
|
||||
Card 118: 73 71 30 16 99 79 4 82 57 78 | 52 82 67 65 17 51 63 71 57 64 31 53 72 44 2 4 30 27 18 81 98 61 22 28 73
|
||||
Card 119: 88 10 85 91 82 94 57 71 34 56 | 53 48 15 56 36 88 61 25 91 94 89 43 95 67 71 58 93 10 51 85 63 34 57 55 82
|
||||
Card 120: 73 11 74 30 65 64 79 2 87 99 | 2 62 73 65 68 99 55 63 38 11 39 15 13 74 10 79 64 57 36 50 30 25 67 94 87
|
||||
Card 121: 82 2 97 58 85 56 78 48 38 99 | 53 93 97 48 38 85 95 75 78 27 56 58 4 6 82 2 40 62 32 99 63 5 20 49 80
|
||||
Card 122: 50 65 62 43 11 68 37 51 48 77 | 40 56 7 5 95 2 88 43 24 62 68 1 11 98 77 78 81 53 70 29 35 76 54 67 48
|
||||
Card 123: 37 11 30 64 67 23 72 91 39 41 | 88 14 55 30 87 44 25 45 72 11 90 21 19 39 91 29 53 67 64 23 46 37 41 80 6
|
||||
Card 124: 83 42 96 57 59 52 6 58 25 16 | 12 19 83 47 65 89 55 82 10 59 68 16 7 52 76 3 92 77 58 48 66 96 53 91 70
|
||||
Card 125: 39 14 78 98 31 87 93 62 9 18 | 62 29 98 27 78 18 47 54 31 52 9 39 43 34 66 14 23 50 93 48 42 90 79 69 19
|
||||
Card 126: 12 80 29 17 51 46 61 1 94 9 | 29 69 85 77 93 23 25 41 8 27 3 9 74 17 83 53 45 71 30 39 98 78 14 67 61
|
||||
Card 127: 40 47 45 95 64 48 75 25 46 81 | 95 87 94 62 75 42 66 27 25 84 93 46 40 71 64 57 1 5 15 21 48 81 55 29 90
|
||||
Card 128: 23 69 80 39 51 2 76 59 48 5 | 44 83 13 66 57 56 64 32 6 94 63 46 61 49 17 59 42 74 19 81 97 7 45 99 58
|
||||
Card 129: 7 40 52 64 28 32 30 73 1 90 | 68 28 77 11 91 73 72 74 33 80 50 63 95 69 2 15 88 35 90 30 21 42 67 26 39
|
||||
Card 130: 28 86 26 91 69 27 78 31 36 38 | 38 57 91 88 64 55 27 74 79 53 29 28 4 44 69 86 67 26 87 71 32 49 17 1 31
|
||||
Card 131: 6 18 56 24 50 10 36 96 49 11 | 58 96 27 31 39 9 8 62 54 33 21 55 22 72 63 73 49 6 38 89 75 18 36 25 3
|
||||
Card 132: 5 67 70 65 76 33 38 14 22 71 | 38 22 91 55 60 34 46 11 54 73 6 45 24 51 95 28 67 26 14 79 7 98 18 84 2
|
||||
Card 133: 97 25 71 76 81 89 60 29 72 57 | 27 3 14 66 75 72 83 6 29 61 71 86 55 99 90 48 20 67 98 42 21 69 30 78 91
|
||||
Card 134: 52 97 35 89 51 71 73 65 5 27 | 4 82 10 39 2 22 5 8 37 53 73 26 68 12 6 95 90 83 43 23 11 31 27 70 86
|
||||
Card 135: 80 92 78 96 7 16 79 93 40 65 | 55 36 17 69 43 85 79 57 40 2 21 35 66 60 98 62 6 23 76 50 48 3 8 37 9
|
||||
Card 136: 68 9 55 86 78 6 7 4 76 43 | 52 81 71 44 70 8 20 34 56 24 13 72 29 78 91 25 12 11 26 51 97 68 84 99 10
|
||||
Card 137: 69 29 5 77 60 26 78 8 24 9 | 97 16 68 82 59 32 37 15 7 43 48 46 79 6 63 66 1 67 40 57 75 74 17 10 81
|
||||
Card 138: 33 58 75 1 43 41 12 51 21 74 | 10 82 56 19 27 14 11 83 15 29 44 94 59 93 81 62 23 53 31 52 80 92 22 79 54
|
||||
Card 139: 92 72 56 86 21 68 79 99 15 25 | 24 60 93 45 42 9 72 99 8 48 25 67 68 83 31 15 56 62 66 86 40 21 92 79 37
|
||||
Card 140: 96 21 85 97 86 37 16 15 44 84 | 86 18 15 98 21 97 43 85 3 31 16 13 69 84 91 94 1 96 59 47 34 37 49 10 44
|
||||
Card 141: 97 75 69 95 37 81 22 71 60 65 | 68 33 83 93 81 21 64 15 54 45 16 58 94 56 65 34 69 60 72 98 18 51 37 5 71
|
||||
Card 142: 12 19 89 22 10 21 36 72 8 66 | 49 87 19 32 10 81 61 3 66 98 58 36 72 12 21 11 8 34 22 77 65 25 82 89 86
|
||||
Card 143: 22 88 25 82 73 86 72 97 5 45 | 91 65 15 79 31 97 72 88 45 22 60 86 75 5 82 12 52 8 4 73 18 56 33 25 99
|
||||
Card 144: 53 64 57 62 39 52 29 58 11 93 | 31 99 11 71 13 42 52 17 39 53 75 64 72 41 62 58 45 83 90 79 4 29 57 93 74
|
||||
Card 145: 78 80 29 81 42 61 9 87 74 4 | 23 82 21 80 73 95 86 81 68 74 87 20 9 92 26 42 11 67 46 4 84 65 41 78 61
|
||||
Card 146: 58 15 80 20 75 60 18 55 22 89 | 40 57 16 65 36 84 49 54 30 4 95 81 29 79 27 87 89 39 97 34 77 72 83 88 91
|
||||
Card 147: 49 81 37 83 15 94 25 61 41 54 | 61 41 57 92 67 17 7 54 95 51 87 46 84 13 60 31 24 14 69 98 32 86 77 94 75
|
||||
Card 148: 53 38 98 41 62 91 80 71 7 19 | 32 47 2 80 53 94 73 45 62 10 38 51 30 83 98 50 12 27 7 91 68 41 92 19 71
|
||||
Card 149: 53 4 21 55 60 84 23 46 75 70 | 66 35 53 92 25 23 75 21 46 84 10 38 26 70 49 13 60 55 4 58 31 51 18 16 94
|
||||
Card 150: 19 76 99 73 40 67 11 71 29 75 | 67 99 42 93 78 96 60 46 75 64 85 10 94 71 43 81 20 50 77 56 40 30 9 19 57
|
||||
Card 151: 86 18 64 56 88 16 92 55 57 4 | 48 12 94 41 65 83 10 46 61 74 82 98 91 6 21 59 9 71 7 90 43 33 40 76 57
|
||||
Card 152: 86 65 15 10 97 80 32 68 69 62 | 91 39 81 67 61 88 23 24 21 6 32 31 77 89 86 64 40 68 41 46 47 37 33 9 44
|
||||
Card 153: 90 73 56 83 21 5 43 61 2 3 | 75 64 50 92 88 15 83 39 41 90 73 94 3 85 87 61 2 18 65 31 21 56 76 72 89
|
||||
Card 154: 27 68 9 51 20 61 42 91 57 5 | 6 17 1 74 60 27 61 28 23 26 87 68 83 48 99 25 50 22 93 21 34 56 72 40 5
|
||||
Card 155: 37 29 63 14 59 77 98 39 54 1 | 95 85 76 31 98 87 1 14 59 22 99 92 88 39 49 54 19 29 80 84 37 20 55 27 7
|
||||
Card 156: 87 39 14 59 27 95 94 19 60 24 | 21 23 2 50 68 27 62 37 26 66 59 67 30 39 76 60 36 89 35 94 73 95 22 10 48
|
||||
Card 157: 43 9 89 71 6 20 7 86 47 50 | 7 45 88 37 5 42 83 12 64 30 20 67 62 54 27 89 51 73 46 65 61 86 19 34 22
|
||||
Card 158: 76 2 28 74 26 77 95 59 34 70 | 41 56 15 91 53 4 9 25 72 42 10 98 84 78 77 24 50 54 96 89 86 62 71 68 8
|
||||
Card 159: 15 29 97 92 87 47 77 5 42 83 | 67 94 35 57 89 64 34 44 52 69 56 25 37 11 60 18 30 33 43 80 78 24 79 92 2
|
||||
Card 160: 77 88 30 64 32 4 59 17 69 28 | 62 39 58 13 60 45 7 26 29 49 96 12 34 3 57 61 17 33 10 68 65 56 2 75 88
|
||||
Card 161: 18 31 21 98 59 23 6 19 39 8 | 80 67 83 68 71 13 72 24 47 49 9 14 44 77 61 4 41 48 12 78 11 90 15 79 64
|
||||
Card 162: 61 40 24 17 74 93 33 59 30 22 | 41 80 74 14 90 15 5 91 34 87 70 3 96 68 16 13 86 65 37 38 47 60 88 10 8
|
||||
Card 163: 79 77 57 65 61 47 70 26 17 73 | 7 88 44 50 29 82 34 93 66 15 89 19 90 36 64 81 8 48 62 53 39 68 55 43 86
|
||||
Card 164: 13 81 91 41 8 99 73 23 30 25 | 12 31 22 4 51 49 23 30 15 32 9 79 87 88 91 6 90 95 1 73 25 43 41 99 46
|
||||
Card 165: 62 95 5 15 98 17 34 80 18 84 | 17 4 24 14 81 74 94 34 87 5 75 62 15 23 49 71 36 84 95 18 98 41 63 13 80
|
||||
Card 166: 80 6 76 69 9 41 56 36 96 34 | 13 68 30 2 24 6 78 74 41 29 44 61 11 48 87 65 33 47 40 99 1 4 14 23 73
|
||||
Card 167: 19 39 50 83 92 1 9 38 97 32 | 6 30 44 56 63 95 34 18 64 26 31 27 47 94 99 10 14 81 62 78 77 16 29 53 39
|
||||
Card 168: 3 5 29 60 62 86 48 22 19 35 | 92 46 66 63 1 57 68 40 49 20 25 47 45 17 18 58 83 98 34 70 52 91 86 38 89
|
||||
Card 169: 29 19 60 14 76 91 77 48 93 86 | 31 22 77 64 61 28 45 38 55 42 46 2 83 48 19 94 58 32 60 84 18 92 89 66 15
|
||||
Card 170: 98 9 24 43 10 36 88 13 97 82 | 34 31 13 93 35 89 4 97 52 11 18 83 51 58 30 10 8 98 38 43 24 36 82 9 28
|
||||
Card 171: 11 74 12 80 48 18 7 92 34 13 | 48 55 65 79 39 9 18 21 20 14 72 58 40 13 37 31 1 53 92 29 88 95 54 35 46
|
||||
Card 172: 73 95 64 94 56 81 89 67 17 26 | 74 16 64 59 40 86 89 34 95 91 26 79 66 49 76 48 72 21 17 81 32 56 38 6 85
Card 173: 63 91 88 29 57 59 44 13 96 15 | 96 30 70 75 9 21 94 99 51 41 80 68 59 15 73 91 48 19 78 38 13 17 36 98 49
Card 174: 75 27 25 34 41 42 70 96 38 29 | 29 74 78 15 48 18 71 3 96 16 25 66 63 17 19 27 99 54 30 65 13 67 42 89 35
Card 175: 15 74 27 36 55 75 94 26 63 42 | 58 91 87 71 80 47 12 36 99 70 20 61 14 17 27 68 21 63 31 56 42 1 78 49 44
Card 176: 1 98 51 53 64 77 83 24 34 16 | 25 32 11 88 58 9 37 4 77 43 14 46 21 23 44 35 33 13 49 57 52 73 87 48 10
Card 177: 44 22 27 47 61 84 26 95 37 97 | 95 9 49 89 32 83 56 42 54 76 28 14 75 51 93 17 94 85 71 96 44 88 41 59 4
Card 178: 40 32 5 93 80 62 92 82 49 25 | 12 90 73 35 85 21 47 48 54 2 28 20 22 86 78 74 24 62 18 77 71 59 82 6 1
Card 179: 27 17 67 60 90 70 36 25 1 12 | 16 35 51 91 46 29 17 71 74 33 37 30 69 57 48 20 9 76 80 31 26 59 85 72 44
Card 180: 4 64 95 14 20 18 28 5 27 75 | 60 57 85 48 92 61 79 16 41 51 81 30 93 44 66 13 52 32 98 82 25 21 34 56 3
Card 181: 74 73 79 94 9 84 63 62 91 35 | 94 77 96 61 83 8 79 69 21 32 84 81 86 98 9 74 90 35 44 43 82 50 62 73 19
Card 182: 46 97 7 34 99 64 57 48 33 77 | 34 85 33 3 15 76 52 89 66 97 65 90 86 80 60 72 99 64 82 12 46 41 8 21 61
Card 183: 1 60 48 82 6 68 46 96 74 71 | 68 66 82 1 46 48 28 6 69 96 25 13 93 38 71 7 50 55 43 74 79 60 41 47 73
Card 184: 18 42 14 51 27 78 53 93 21 69 | 21 39 75 5 51 93 25 84 53 26 69 38 14 27 88 15 40 42 62 78 92 58 31 49 18
Card 185: 73 49 30 67 90 65 23 52 18 47 | 54 71 65 10 47 90 49 30 79 67 73 2 75 18 16 31 98 94 78 29 17 66 52 23 1
Card 186: 21 7 99 86 16 12 20 53 89 22 | 89 49 56 92 9 18 95 52 74 6 85 15 4 19 55 24 33 10 53 22 28 78 42 38 12
Card 187: 73 92 16 74 60 72 19 87 10 59 | 92 23 13 87 72 5 74 15 20 60 59 71 53 79 63 31 73 55 16 54 97 78 10 21 19
Card 188: 57 1 2 31 17 52 68 45 88 56 | 68 94 5 1 31 11 44 79 89 56 32 2 45 90 97 17 96 9 57 10 88 14 86 78 52
Card 189: 15 31 46 58 50 8 48 96 49 38 | 33 96 2 15 49 78 30 56 13 46 53 8 38 63 31 88 58 48 19 50 85 26 76 99 87
Card 190: 65 2 86 57 79 7 89 41 32 50 | 13 94 54 33 3 96 57 77 17 38 1 18 61 43 53 73 24 81 49 10 70 39 51 46 27
Card 191: 51 50 25 84 85 23 34 40 88 45 | 68 48 84 62 93 92 34 80 36 42 56 88 26 41 8 17 85 15 70 65 11 23 12 2 3
Card 192: 65 57 15 10 60 9 42 36 74 71 | 12 67 56 89 26 24 61 87 50 54 96 47 3 93 49 63 82 75 21 88 8 45 94 19 64
Card 193: 19 59 37 75 85 24 56 5 6 28 | 75 15 37 7 21 95 39 77 17 49 72 44 6 16 63 85 59 87 5 83 56 47 19 71 30
Card 194: 99 50 96 60 12 48 81 92 9 29 | 48 61 82 88 75 72 12 29 53 28 10 4 80 37 9 96 69 81 25 99 36 50 92 21 60
Card 195: 41 36 67 61 12 95 78 63 29 24 | 2 53 16 27 21 48 36 88 98 30 78 86 63 77 62 92 69 41 7 68 70 50 22 5 61
Card 196: 79 35 7 64 14 37 90 16 40 9 | 1 18 86 66 32 92 63 36 26 99 82 95 21 46 81 17 75 59 85 27 15 29 83 43 84
Card 197: 18 63 11 37 30 96 60 16 61 89 | 88 6 67 10 90 43 66 26 53 55 22 96 27 42 91 80 73 12 37 11 31 5 33 52 23
Card 198: 20 73 72 16 31 83 88 36 95 44 | 15 32 92 52 31 49 46 97 65 71 39 16 72 67 50 68 28 8 87 25 11 61 96 43 86
Card 199: 71 57 94 28 4 49 25 47 42 44 | 70 86 29 64 37 5 99 74 71 25 10 40 12 87 66 60 33 77 28 75 52 31 79 96 49
Card 200: 90 67 89 44 21 45 31 4 92 63 | 13 8 76 70 26 29 74 2 88 47 46 10 25 43 97 65 27 73 16 71 55 40 58 69 66
Card 201: 80 70 21 52 81 91 27 61 72 12 | 46 62 18 54 44 77 92 80 38 9 43 76 93 94 14 79 86 58 15 40 31 16 2 33 96
Card 202: 96 97 42 50 23 41 81 52 17 28 | 4 55 64 51 1 97 91 93 95 60 33 45 99 83 62 26 86 16 12 2 3 92 87 74 37
Card 203: 3 95 82 57 59 23 20 77 49 28 | 60 35 25 96 83 91 47 86 40 73 33 24 12 48 55 67 88 85 16 31 70 32 17 66 97
Card 204: 11 9 81 75 39 52 19 96 47 66 | 37 22 70 43 51 72 7 67 50 83 90 23 24 28 57 87 86 13 27 76 94 35 40 17 91
@@ -1,27 +0,0 @@
4238460975 3150676058 14156194
4014738493 2552067322 165315151
2782663538 3067003586 60442604
718350022 1496692875 242681298
0 662267357 48987302
73802866 465780476 196486881
270289747 736070223 448060275
2501821195 4263593575 31373721
961031320 0 47107691
2448671317 2498917444 53149878
2843106142 3164832252 729755546
4180053644 2717382473 58407331
2533194916 4223942180 39651395
48987302 711254659 24815564
2037107882 2799681618 267321968
2424779503 2775789804 23891814
1547175259 1304493961 192198914
4252617169 4084472268 19120259
1871165319 3918529705 61914957
3572861688 1871165319 441876805
2758721631 3894587798 23941907
2304429850 4103592527 120349653
1933080276 3980444662 104027606
1008139011 47107691 418672785
1426811796 1184130498 120363463
4271737428 3127446190 23229868
2572846311 2313042124 185875320
@@ -1,39 +0,0 @@
4260564640 3164238850 33008819
2293789713 3286584985 52546193
2087002602 2864270962 68938922
1297747555 1309838844 89337809
3093628267 3842203176 155987450
2609276317 3498417185 343785991
658125616 1701481170 20754060
1593540119 1399176653 128695111
2283933279 2244808425 9856434
3849705959 3012295008 151943842
678879676 1170609407 139229437
1009204170 485451665 270016861
2155941524 4227835566 67131730
4032228982 4161145047 66690519
3532191685 2179585888 65222537
2953062308 2268317830 85579399
2252523457 3339131178 23090374
818109113 755468526 17485651
4098919501 2254664859 13652971
3379699400 2416907530 152492285
4293573459 2662947015 1393837
835594764 1527871764 173609406
2275613831 3422250936 8319448
4112572472 2933209884 24098564
1279221031 466925141 18526524
3038641707 2957308448 54986560
397655230 0 260470386
2223073254 3392800733 29450203
0 772954177 397655230
4136671036 3430570384 30346404
3597414222 3998190626 162954421
2546266016 2353897229 63010301
3249615717 3460916788 37500397
1387085364 260470386 206454755
3287116114 2087002602 92583286
3760368643 3197247669 89337316
2346335906 2664340852 199930110
4167017440 2569399815 93547200
4001649801 3362221552 30579181
@@ -1,42 +0,0 @@
2521168614 3718558727 45222681
2372021437 4250929390 44037906
2416059343 3070381062 105109271
391082070 1490595758 135161830
2750033935 3567996322 26024928
2631208948 4085216210 118824987
1606793146 1161017018 154561777
0 27318229 64007187
2566391295 3763781408 64817653
2205452704 2073181756 31511904
2354729618 109736771 15352358
526243900 143079078 467881514
165490760 1625757588 156087087
321577847 610960592 69504223
1116662502 1843304861 180171121
2173847890 2322708438 31604814
64007187 1315578795 101483573
1315244978 1417062368 73533390
2989245773 2722605383 134588769
3800621948 4204041197 46888193
3287840442 2372021437 350583946
1067967658 12382058 14936171
3276570277 3971207241 11270165
3152032800 3594021250 124537477
1761354923 2023475982 49705774
2236964608 680464815 117765010
4152595905 3982477406 102738804
3847510141 3920328030 50879211
3638424388 3429696886 70468591
1098672553 125089129 17989949
1999021216 798229825 174826674
4255334709 3528363735 39632587
1296833623 91325416 18411355
3708892979 3828599061 91728969
1388778368 2104693660 218014778
1811060697 973056499 187960519
1082903829 2354313252 15768724
2776058863 2857194152 213186910
3898389352 3175490333 254206553
994125414 1781844675 61460186
3123834542 3500165477 28198258
1055585600 0 12382058
@@ -1,32 +0,0 @@
3356468240 2934525445 29117552
4275689831 4042213712 19277465
949730239 1589971986 381295142
2205130246 3387543719 106537240
2442849314 2188173171 261901063
2027919967 875104547 177210279
4258838211 4278115676 16851620
1969509044 3125327238 8268732
3602491835 652291761 28146990
3630638825 3122528592 2798646
1725486280 3012647256 109881336
3232765106 192460045 36910273
4042213712 4061491177 216624499
2311667486 3256361891 131181828
2849273982 3133595970 102505596
1365732141 2963642997 49004259
3093408594 3494080959 139356512
3385585792 1971267128 216906043
2954083526 56695294 82629774
1331025381 2483732118 34706760
3322810356 2450074234 33657884
3269675379 139325068 53134977
2704750377 680438751 144523605
1977777776 824962356 50142191
929469914 3236101566 20260325
0 1363064706 224603332
1835367616 2800384017 134141428
647524775 2518438878 281945139
2951779578 1587668038 2303948
1414736400 1052314826 310749880
224603332 229370318 422921443
3036713300 0 56695294
Some files were not shown because too many files have changed in this diff