279 points by david927 1 day ago | 846 comments
tetris11 15 hours ago
Take photos of the tree from 6 different angles, feed into a 3D model generator, erode the model and generate a 3D graph representation of the tree.
The tool suggests which cuts to make and where, given a restricted fall path (e.g., constrained by a neighbor's yard on one side).
I place the fallen branches in their final state along the fall plane, then compute individual correction vectors mapping them back to their original positions, in an order such that no branch's vector intersects another's.
The idea came to me when a particularly difficult tree needed to come down in my friend's yard, and we spent hours planning it out. I've already gotten some interest from the tree-surgeon community; I just need to appify it.
The second iteration will treat the problem more as a physics one than a graph one, with energy-minimisation methods for solving.
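The ordering constraint above (each cut must come before the branch it depends on) can be sketched as a post-order walk of the branch graph. This is only an illustration, not the actual tool: the tree structure and branch names here are invented, and the real version also has to check that correction vectors don't intersect in space.

```python
# Hypothetical sketch: order branch cuts so each branch is removed
# before the branch it grows from (outermost limbs first).
def cut_order(children, root):
    """Post-order traversal: every child branch is cut before its parent."""
    order = []

    def walk(node):
        for child in children.get(node, []):
            walk(child)
        order.append(node)

    walk(root)
    return order

# Toy tree: trunk supports two limbs, each carrying smaller branches.
tree = {
    "trunk": ["limb_a", "limb_b"],
    "limb_a": ["branch_a1", "branch_a2"],
    "limb_b": ["branch_b1"],
}

plan = cut_order(tree, "trunk")
# Every branch appears before the limb it grows from; the trunk comes last.
```

A real planner would layer the fall-path constraint on top of this ordering, rejecting any step whose fall vector crosses a branch still standing.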
vintagedave 14 hours ago
chaosharmonic 8 hours ago
It's turning into various DIY rabbit holes, actually, with the next one (outside of various related landscaping stuff) being to gut a basement.
firesteelrain 3 hours ago
defterGoose 8 hours ago
pbhjpbhj 6 hours ago
toss1 3 hours ago
It is a very simple three-pass plan: "Deadwood, Crossovers, Aesthetics".
So, first pass: go through the tree cutting out only, and all of, the dead branches. Cut back to live stock, and as always make good clean cuts at a proper angle (many horticulture books provide far better instructions on this).
Second pass: look only for branches that cross over other branches, especially those that show rubbing or friction marks against other branches. Cut the ones that are either least healthy or grow in the craziest direction (i.e., markedly away from the normal, more-or-less radial, direction out from the trunk).
Then, and only after the other two passes are complete, start pruning for the desired look and/or size & shape for planned growth or bearing fruit.
This method is simple and saves a LOT of ruined trees. Cutting first for size and appearance means that by the time the deadwood and crossovers are removed, the tree is a scraggly mess that takes years to grow back. And it works well even for novices, as long as they pay attention.
I'd suspect entering the state and direction of every branch into an app would take longer than just pruning with the above method, although for trees that haven't fully leafed out, perhaps a 360° set of drone pics could make an adequate 3D model to use for planning?
In any case, good luck with your fruit trees — may they grow healthy and provide you with great bounty for many years!
ddahlen 10 hours ago
https://github.com/dahlend/kete
It can predict the location of the entire catalog of known asteroids, generally to within the uncertainty of our knowledge of their orbits. Its core is written in Rust, with a Python frontend.
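This is not kete's actual API (see the repo for that), but the heart of any such propagator is solving Kepler's equation M = E − e·sin(E) for the eccentric anomaly. A minimal Newton-iteration sketch:

```python
import math

def eccentric_anomaly(mean_anomaly, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) for E by Newton's method."""
    E = mean_anomaly if e < 0.8 else math.pi  # standard starting guess
    for _ in range(50):
        dE = (E - e * math.sin(E) - mean_anomaly) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def true_anomaly(E, e):
    """Convert eccentric anomaly to true anomaly for an elliptical orbit."""
    return 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                            math.sqrt(1 - e) * math.cos(E / 2))

E = eccentric_anomaly(1.0, 0.2)
# Sanity check: E - e*sin(E) recovers the mean anomaly.
```

Production tools like kete go far beyond this two-body picture (perturbations, light-time corrections, observer geometry), which is where the hard work lives.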
jxjnskkzxxhx 9 hours ago
It sounds really impressive.
ddahlen 9 hours ago
dang 7 hours ago
(If so, email hn@ycombinator.com and we'll put it in the second-chance pool (https://news.ycombinator.com/item?id=26998308), so it will get a random placement on HN's front page.)
ddahlen 5 hours ago
dang 5 hours ago
More importantly, good luck with the PhD and we all hope it goes swimmingly!
Intralexical 7 hours ago
Would it be appropriate to note in the README which telescopes this is used for? You see these very niche, very professional-looking repositories on GitHub now and then, and it's never clear how much credibility they have and whether they come from a hobbyist, a student, an experiment, or operational use.
juxtaposicion 12 hours ago
On top of that, it uses a lightweight AI model to read product descriptions and filter based on things like ingredients (e.g., flagging peanut butter with BPA by checking every photograph of the plastic, or avoiding palm oil by reading the nutrition facts) or brand lists (e.g., only showing WSAVA-compliant dog foods). I'm still reviewing results manually to catch bad extractions.
Started this to replace a spreadsheet I was keeping for bulk purchases. Slowly adding more automation like alerting on price drops or restocking when under a threshold.
abdullahkhalids 2 hours ago
However, what I would like is a product where I upload my shopping receipts from a few weeks/months at the one store I go to. The application figures out what I typically buy, then compares the 4-5 big stores and tells me which one I should go to for the lowest prices.
juxtaposicion 36 minutes ago
Uploading a receipt to see how much you can save... that's a good idea. I think I can find your email via your personal site. Can I email you when we have a prototype ready?
abdullahkhalids 26 minutes ago
However, I am in Canada. So can only test it once you expand there. Thanks.
I don't know how things are in the US, but it does seem like the grocery store oligopoly is squeezing consumers a lot, so tools like this are valuable for injecting competition into the system.
KerryJones 12 hours ago
I'm also curious whether there is a way to enter a list of items I want and have it calculate which store, in aggregate, is the cheapest.
For instance, people often tell me Costco is much cheaper than the alternatives, and to verify that I'd have to price out my shopping cart at multiple stores.
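The aggregate comparison being asked for is just a sum over a shared basket. A minimal sketch, with entirely invented prices:

```python
# Minimal sketch: given per-store prices for the same basket of items,
# total each store and pick the cheapest overall. All prices invented.
prices = {
    "Costco": {"milk": 3.49, "eggs": 4.99, "peanut butter": 5.00},
    "Target": {"milk": 3.99, "eggs": 4.49, "peanut butter": 7.74},
}
basket = ["milk", "eggs", "peanut butter"]

totals = {store: sum(p[item] for item in basket)
          for store, p in prices.items()}
cheapest = min(totals, key=totals.get)
# Note the per-item winner (eggs are cheaper at Target here) can differ
# from the aggregate winner.
```

The hard part in practice is matching "the same item" across stores (brands, sizes, units), not the arithmetic.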
mynameisash 5 hours ago
A few years ago, I was very diligently tracking _all_ my family's grocery purchases. I kept every receipt, entered it into a spreadsheet, added categories (eg, dairy, meat), and calculated a normalized cost per unit (eg, $/gallon for milk, $/dozen eggs).
I learned a lot from that, and I think I saved our family a decent amount of money, but man it was a lot of work.
juxtaposicion 5 hours ago
@mynameisash I'm curious what you learned... maybe I can help more people learn that using Popgot data.
mynameisash 4 hours ago
I just dusted off my spreadsheet, and it's not as complete as I'd like it to be. I didn't normalize everything but did have many of the staples like milk and eggs normalized; some products had multiple units (eg, "bananas - each" vs "bananas - pound"); and a lot of my comparisons were done based on the store (eg, I was often comparing "Potatoes - 20#" at Costco but "Potatoes - 5#" at Target over time).
Anyway, Costco didn't always win, but in my experience, they frequently did -- $5 peanut butter @ Costco vs $7.74 @ Target based on whatever size and brand I got, which is interesting because Costco doesn't have "generic" PB, whereas Target has much cheaper Market Pantry, and I tried to opt for that.
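The normalization step that makes comparisons like "Potatoes - 20#" vs "Potatoes - 5#" possible can be sketched in a few lines (pack sizes and prices here are invented, not from the spreadsheet above):

```python
# Sketch: normalize prices to a common unit so a 20 lb bag and a 5 lb bag
# of potatoes are directly comparable.
def unit_price(total_price, quantity, unit):
    """Return (price per unit, unit), e.g. dollars per pound."""
    return total_price / quantity, unit

costco = unit_price(9.99, 20, "lb")   # "Potatoes - 20#"
target = unit_price(3.49, 5, "lb")    # "Potatoes - 5#"
# Comparing costco[0] and target[0] now compares $/lb, not pack prices.
```

Items sold "each" vs "by the pound" (the bananas case above) still need a conversion factor before they share a unit, which is where spreadsheets get painful.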
juxtaposicion 5 hours ago
We have historical price tracking in the database, but haven't exposed it as a product yet. What do you have in mind / what would you use it for?
mynameisash 4 hours ago
juxtaposicion 3 hours ago
So yeah, we'll add it. If you shoot me an email (or post it here?) to chris @ <our site>.com I'll send you a link when it's done. Should take a day or two.
unvalley 10 hours ago
xarici_ishler 1 day ago
You can search for full or partial rows and see the whole query lineage – which intermediate rows from which CTEs/subqueries contributed to the result you're searching for.
Entirely offline & no usage of AI. Free in-browser version (using PGLite WASM), paid desktop version.
No website yet, here's a 5 minute showcase (skip to middle): https://www.loom.com/share/c03b57fa61fc4c509b1e2134e53b70dd
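Without a lineage tool, tracing a surprising result row means materializing each CTE stage by hand. A toy illustration of that manual process, using Python's stdlib sqlite3 (the tool itself targets Postgres via PGLite; schema and data invented):

```python
import sqlite3

# Manual lineage-hunting: run the CTE body on its own to see which
# intermediate rows fed the final result.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 10.0), (2, 250.0), (3, 40.0)])

query = """
WITH big AS (SELECT * FROM orders WHERE amount > 100)
SELECT COUNT(*) FROM big
"""
final = con.execute(query).fetchone()[0]

# To trace the count, re-run the CTE stage separately:
stage = con.execute("SELECT * FROM orders WHERE amount > 100").fetchall()
# stage lists exactly the rows that contributed to the final count.
```

With deeply nested CTEs this per-stage re-running gets tedious fast, which is presumably the gap the debugger fills by surfacing the whole lineage at once.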
benjaminsky2 11 hours ago
I would recommend you target data warehouses like snowflake and bigquery where the query complexity and thus value prop for a tool like this is potentially much higher.
xarici_ishler 11 hours ago
I can ping you via email when the debugger is ready, if you're interested. My email is in my profile
parrit 20 hours ago
thenaturalist 5 hours ago
Even if not for DuckDB, you could possibly use this to validate or parse queries.
jeffhuys 9 hours ago
thebytefairy 30 minutes ago
xarici_ishler 8 hours ago