Ask HN: What are you working on? (April 2025)

279 points by david927 1 day ago | 846 comments

What are you working on? Any new ideas that you're thinking about?

tetris11 15 hours ago

A tree cutting tool.

Take photos of the tree from 6 different angles, feed into a 3D model generator, erode the model and generate a 3D graph representation of the tree.

The tool suggests which cuts to make and where, given a restricted fall path (e.g. constrained by a neighbor's yard on one side).

I create the fallen branches in their final state along the fall plane, and create individual correction vectors mapping them back to their original state, but in an order that does not intersect other branch vectors.
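Roughly, the ordering step boils down to something like this toy 2D sketch (made-up field names, not the real code):

    from dataclasses import dataclass

    @dataclass
    class Branch:
        name: str
        attach: tuple    # where it joins the trunk while standing
        tip: tuple       # branch tip while standing
        landing: tuple   # where the tip should land on the fall plane

    def _ccw(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

    def crosses(p1, p2, q1, q2):
        # strict segment intersection test; ignores touching endpoints
        return (_ccw(q1, q2, p1) * _ccw(q1, q2, p2) < 0 and
                _ccw(p1, p2, q1) * _ccw(p1, p2, q2) < 0)

    def cut_order(branches):
        # Greedy: a branch may be dropped when its fall path (tip -> landing)
        # crosses no branch that is still standing.
        standing, order = list(branches), []
        while standing:
            for b in standing:
                blockers = [o for o in standing if o is not b]
                if not any(crosses(b.tip, b.landing, o.attach, o.tip) for o in blockers):
                    order.append(b)
                    standing.remove(b)
                    break
            else:
                raise ValueError("no collision-free order under this toy model")
        return order

The real thing works on the 3D graph with proper branch geometry; this is just the gist of the non-intersection constraint.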

The idea came to me when a particularly difficult tree needed to come down in my friend's yard and we spent hours planning it out. I've already gotten some interest from the tree-surgeon community; I just need to appify it.

Second rendition will treat the problem more as a physics one than a graph one, with some energy-minimisation methods for solving.

vintagedave 14 hours ago

This is the kind of thing that makes me love HN. An idea I would never have thought of, with an immediately obvious use in multiple ways (fall path plus ideal lumber cutting?), probably very difficult, yet being tackled with one implementation already... and spoken of quite humbly.

chaosharmonic 8 hours ago

Funny, one of mine also involves trees -- but is mostly outdoor cleanup. The kind that involves decades' worth of it, thanks to what I'll just say is a lot of maintenance that wasn't done over a long time. There's an extensive amount of brush, leaves, etc of varying ages that could maybe be shredded up into something useful, invasive vines I'm still trying to deal with, and more old trash than I've fully figured out how to properly dispose of.

It's turning into various DIY rabbit holes, actually, with the next one (outside of various related landscaping stuff) being to gut a basement.

firesteelrain 3 hours ago

You should branch out (hehe) into flower and plant pruning suggestions with your app. ChatGPT can do this now if prompted.

defterGoose 8 hours ago

I would love to have such a model tell me how to prune my fruit trees as they grow up. Should be a fairly straightforward supervised problem with the right front end for the graph generation.

pbhjpbhj 6 hours ago

When I read the OP this is what I thought it was going to be - these branches are going to be apex competitors, these are crossing or going to cross, this one shows signs of disease, this one interrupts air flow through the centre, etc.

toss1 3 hours ago

You can start right now with an algorithm I learned from an expert when I was working in a landscaping business.

It is a very simple three-pass plan: "Deadwood, Crossovers, Aesthetics".

So, first pass, go through the tree cutting out only and all the dead branches. Cut back to live stock, and as always make good clean cuts at a proper angle (many horticulture books will provide far better instructions on this).

Second pass, look only for branches that cross over other branches, especially those that show rubbing or friction marks against other branches. Cut the ones that are either least healthy or grow in the craziest direction (i.e., deviate most from the normal, more-or-less radial, growth away from the trunk).

Then, and only after the other two passes are complete, start pruning for the desired look and/or size & shape for planned growth or bearing fruit.

This method is simple and saves a LOT of trees from being ruined by cutting for size and appearance first; by the time the deadwood and crossovers are taken out later, the tree is a scraggly mess that takes years to grow back. And it even works well for novices, as long as they pay attention.
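If you did want to encode this for an app, the three passes map over pretty directly. A toy sketch, every field made up:

    from itertools import combinations

    # Toy encoding of "Deadwood, Crossovers, Aesthetics" over flat branch records.
    # A real app would derive these fields from the tree model.
    def plan_pruning(branches, max_length_m):
        cuts = [b for b in branches if b["dead"]]            # pass 1: all the deadwood
        live = [b for b in branches if not b["dead"]]
        for a, b in combinations(live, 2):                   # pass 2: crossovers
            if b["id"] in a["crosses"]:
                # cut the less healthy or more wayward branch of the rubbing pair
                loser = min((a, b), key=lambda x: (x["health"], -x["deviation_deg"]))
                if loser not in cuts:
                    cuts.append(loser)
        keep = [b for b in live if b not in cuts]
        cuts += [b for b in keep if b["length_m"] > max_length_m]   # pass 3: aesthetics
        return cuts

    example = [
        {"id": 1, "dead": True,  "health": 0.0, "deviation_deg": 0,  "length_m": 1.0, "crosses": set()},
        {"id": 2, "dead": False, "health": 0.9, "deviation_deg": 10, "length_m": 2.5, "crosses": {3}},
        {"id": 3, "dead": False, "health": 0.4, "deviation_deg": 70, "length_m": 1.8, "crosses": {2}},
    ]
    print([b["id"] for b in plan_pruning(example, max_length_m=2.0)])  # -> [1, 3, 2]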

I'd suspect entering the state and direction of every branch into an app would take longer than just pruning with the above method, although for trees that haven't fully leafed out, perhaps a 360° set of drone pics could make an adequate 3D model to use for planning?

In any case, good luck with your fruit trees — may they grow healthy and provide you with great bounty for many years!

ddahlen 10 hours ago

I am about to begin a PhD in astronomy. Until last month, I had spent 3 years working at Caltech on code that calculates asteroid orbits to high precision. This code is now being used by several NASA telescopes to predict when they will image known asteroids (NEO Surveyor, SphereX, maybe Roman eventually). I was allowed to open source it, and I am planning on making it the basis of my PhD research:

https://github.com/dahlend/kete

It can predict the location of the entire catalog of known asteroids, generally to within the uncertainty of our knowledge of the orbits. Its core is written in Rust, with a Python frontend.
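If anyone wants a flavour of the underlying problem, here is a toy two-body sketch in Python. This is emphatically not kete's API; real precision work has to handle perturbations, observer geometry, timing, and much more:

    import math

    MU_SUN = 1.32712440018e11  # km^3/s^2, gravitational parameter of the Sun

    def position_at(a_km, e, t_sec):
        """In-plane (x, y) of a body on an elliptical orbit, t seconds after perihelion."""
        n = math.sqrt(MU_SUN / a_km**3)        # mean motion [rad/s]
        M = n * t_sec                          # mean anomaly
        E = M                                  # solve Kepler's equation M = E - e*sin(E)
        for _ in range(50):                    # Newton iterations
            E -= (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        nu = 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                              math.sqrt(1 - e) * math.cos(E / 2))  # true anomaly
        r = a_km * (1.0 - e * math.cos(E))     # distance from the focus
        return r * math.cos(nu), r * math.sin(nu)

    # Roughly Ceres-like orbit, half an Earth year after perihelion:
    print(position_at(a_km=4.14e8, e=0.078, t_sec=0.5 * 365.25 * 86400))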

jxjnskkzxxhx 9 hours ago

Ever thought of making a presentation about this subject and putting it on YouTube? :-)

It sounds really impressive.

ddahlen 9 hours ago

I've never really dabbled in YouTube. I have several projects/papers I am working on using this code; I have thought about writing some blog posts as I publish those. But a PhD is going to be a major time sink, so we will see what happens.

dang 7 hours ago

Do you want to post it as a Show HN soon, before the PhD sucks you in altogether?

(If so, email hn@ycombinator.com and we'll put it in the second-chance pool (https://news.ycombinator.com/item?id=26998308), so it will get a random placement on HN's front page.)

ddahlen 5 hours ago

Thank you for the offer! Unfortunately the PhD has already sunk its claws in. I should have some flashy stuff to show off in 2-3 months (I have a conference talk coming up that I have to prepare material for).

dang 5 hours ago

Drat! Well, if you ever have some cycles to share your work on HN, contact us at hn@ycombinator.com and we'll be happy to help.

More importantly, good luck with the PhD and we all hope it goes swimmingly!

Intralexical 7 hours ago

I like the daredevil asteroids going for the close dive of the star emoji sun :)

Would it be appropriate to mention in the README which telescopes this is used for? You see these very niche, very professional-looking repositories on GitHub now and then, and it's never clear how much credibility they have and whether they come from a hobbyist, a student, an experiment, or are in operational use.

juxtaposicion 12 hours ago

I’m working on Popgot (https://popgot.com), a tool that tracks unit prices (cost per ounce, sheet, pound) across Costco, Walmart, Target, and Amazon. It normalizes confusing listings (“family size”, “mega pack”, etc.) to surface the actual cheapest option for daily essentials.
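The normalization is mundane but fiddly -- roughly this kind of thing, sketched with made-up listings and prices (the real pipeline handles many more formats and units):

    import re

    SIZE = re.compile(r"(\d+(?:\.\d+)?)\s*(?:oz|ounces?)\b", re.IGNORECASE)
    COUNT = re.compile(r"(?:x\s*(\d+)\b|(\d+)\s*(?:pack|count|ct)\b)", re.IGNORECASE)

    def price_per_oz(title, price):
        size = SIZE.search(title)
        if not size:
            return None                      # other units would be handled here
        count = COUNT.search(title)
        n = int(count.group(1) or count.group(2)) if count else 1
        return price / (float(size.group(1)) * n)

    for title, price in [("Peanut Butter, Family Size, 40 oz", 7.48),
                         ("Peanut Butter Mega Pack, 16 oz x 3", 12.99)]:
        print(title, "->", round(price_per_oz(title, price), 3), "$/oz")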

On top of that, it uses a lightweight AI model to read product descriptions and filter based on things like ingredients (e.g., flagging peanut butter with BPA by checking every photograph of the plastic or avoiding palm oil by reading the nutrition facts) or brand lists (e.g., only showing WSAVA-compliant dog foods). Still reviewing results manually to catch bad extractions.

Started this to replace a spreadsheet I was keeping for bulk purchases. Slowly adding more automation like alerting on price drops or restocking when under a threshold.

abdullahkhalids 2 hours ago

I don't think I have the time to go to different stores to buy different things based on what is cheap. I stick to one fixed store.

However, what I would like is a product where I upload my shopping receipts for a few weeks/months from the one store I go to. The application figures out what I typically buy, then compares the 4-5 big stores and tells me which one I should go to for the lowest overall price.
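Conceptually it's nothing more than summing my usual basket at each store and picking the minimum, something like this (all prices made up):

    from collections import Counter

    def cheapest_store(basket, prices):
        totals = {store: sum(qty * table[item]
                             for item, qty in basket.items() if item in table)
                  for store, table in prices.items()}
        return min(totals.items(), key=lambda kv: kv[1])

    basket = Counter({"milk 2L": 2, "eggs dozen": 1, "bread": 3})   # learned from my receipts
    prices = {"Store A": {"milk 2L": 4.99, "eggs dozen": 3.49, "bread": 2.99},
              "Store B": {"milk 2L": 4.29, "eggs dozen": 3.99, "bread": 2.79}}
    print(cheapest_store(basket, prices))   # Store B wins for this basket

The hard part is obviously matching receipt line items to products across stores, not the arithmetic.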

juxtaposicion 36 minutes ago

Yeah, I agree. It is a pain to search product by product instead of sticking to one store. Also popgot.com can only do what's online & shipped to you -- so really just the non-perishables / daily essentials that are not fresh groceries. But even when limited to consumables I save ~$100/mo by basically buying by unit price.

Uploading a receipt to see how much you can save... that's a good idea. I think I can find your email via your personal site. Can I email you when we have a prototype ready?

abdullahkhalids 26 minutes ago

A one-time email is fine.

However, I am in Canada, so I can only test it once you expand there. Thanks.

I don't know how things are in the US, but it does seem like the grocery store oligopoly is squeezing consumers a lot, so tools like this are valuable for injecting competition into the system.

KerryJones 12 hours ago

I like this idea a lot -- feels like there's a lot of room to grow here. Do you have any sort of historical price tracking/alerting?

And/or I'm also curious if there is a way to enter a list of items I want and have it calculate which store - in aggregate - is the cheapest.

For instance, people often tell me Costco is much cheaper than alternatives, and for me to compare I have to compile my shopping cart in multiple stores to compare.

mynameisash 5 hours ago

> For instance, people often tell me Costco is much cheaper than alternatives, and for me to compare I have to compile my shopping cart in multiple stores to compare.

A few years ago, I was very diligently tracking _all_ my family's grocery purchases. I kept every receipt, entered it into a spreadsheet, added categories (eg, dairy, meat), and calculated a normalized cost per unit (eg, $/gallon for milk, $/dozen eggs).

I learned a lot from that, and I think I saved our family a decent amount of money, but man it was a lot of work.

juxtaposicion 5 hours ago

Glad you guys mentioned Costco -- I happen to have written a blog post on exactly that: https://popgot.com/blog/retailer-comparison Surprisingly, Costco does not win most of the time, especially if you are not brand loyal. Costco has famously low margins, but it turns out that when you sort by price-per-unit they're OK, but not great.

@mynameisash I'm curious what you learned... maybe I can help more people learn that using Popgot data.

mynameisash 4 hours ago

One thing to call out is that costco.com and in-person have different offerings (& prices) -- but you probably know that already.

I just dusted off my spreadsheet, and it's not as complete as I'd like it to be. I didn't normalize everything but did have many of the staples like milk and eggs normalized; some products had multiple units (eg, "bananas - each" vs "bananas - pound"); and a lot of my comparisons were done based on the store (eg, I was often comparing "Potatoes - 20#" at Costco but "Potatoes - 5#" at Target over time).

Anyway, Costco didn't always win, but in my experience, they frequently did -- $5 peanut butter @ Costco vs $7.74 @ Target based on whatever size and brand I got, which is interesting because Costco doesn't have "generic" PB, whereas Target has much cheaper Market Pantry, and I tried to opt for that.

juxtaposicion 5 hours ago

I'm so glad you like it!

We have historical price tracking in the database, but haven't exposed it as a product yet. What do you have in mind / what would you use it for?

mynameisash 4 hours ago

I like that you have the ability to exclude on some dimension (eg, I don't use Amazon.com). Do you have or are you considering adding more retailers beyond the four you mentioned? For example, I buy a lot of unroasted coffee from sweetmarias.com, and excluding Amazon from Popgot results eliminates all but one listing (from Walmart).

juxtaposicion 3 hours ago

Ah, hell yeah! My buddy on this project has been itching to add sweetmarias.com ... he just needed this as an excuse.

So yeah, we'll add it. If you shoot me an email (or post it here?) to chris @ <our site>.com I'll send you a link when it's done. Should take a day or two.

unvalley 10 hours ago

Cool! I hope it's coming to Japan (where I live) in the near future.

xarici_ishler 1 day ago

The first ever SQL debugger – runs & visualizes your query step-by-step, every clause, condition, expression, incl. GROUP BY, aggregates / windows, DISTINCT (ON), subqueries (even correlated ones!), CTEs, you name it.

You can search for full or partial rows and see the whole query lineage – which intermediate rows from which CTEs/subqueries contributed to the result you're searching for.
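The rough mental model: every clause becomes its own materialized intermediate result. A stripped-down illustration (sketched against SQLite here just for brevity; the actual tool runs on Postgres/PGLite and tracks far more per step):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE emp (name TEXT, dept TEXT, salary INT);
        INSERT INTO emp VALUES ('a','eng',120), ('b','eng',90), ('c','ops',150);
    """)
    steps = {
        "1. FROM":     "SELECT * FROM emp",
        "2. WHERE":    "SELECT * FROM emp WHERE salary > 100",
        "3. GROUP BY": "SELECT dept, COUNT(*) FROM emp WHERE salary > 100 GROUP BY dept",
    }
    for label, sql in steps.items():
        print(label, conn.execute(sql).fetchall())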

Entirely offline & no usage of AI. Free in-browser version (using PGLite WASM), paid desktop version.

No website yet, here's a 5 minute showcase (skip to middle): https://www.loom.com/share/c03b57fa61fc4c509b1e2134e53b70dd

benjaminsky2 11 hours ago

This is awesome! I work with a team of analysts and data engineers who own a pretty big Snowflake data warehouse. We write a ton of dbt models and have a range of SQL skill levels on the team. This would be the perfect way to allow more junior devs to build their skills quickly and support more complex models.

I would recommend you target data warehouses like Snowflake and BigQuery, where the query complexity, and thus the value prop for a tool like this, is potentially much higher.

xarici_ishler 11 hours ago

Thank you, nice to get some idea validation from folks in the industry. For sure data warehouses are the top priority on my TODO list; I picked PG first because that's what I'm familiar with.

I can ping you via email when the debugger is ready, if you're interested. My email is in my profile

parrit 20 hours ago

Was thinking today... not a debugger, but even a SQL progress bar, so I know in advance that my ADD COLUMN will take, say, 7 hours.

thenaturalist 5 hours ago

Possibly look at https://duckdb.org/community_extensions/extensions/parser_to...

Even if not for DuckDB, you could possibly use this to validate/parse queries.

jeffhuys 9 hours ago

This would be incredible for understanding why some queries execute slowly; most of the time it's one of the steps in between that takes 99% of the execution time at our company. Do you record the time each step takes?

thebytefairy 30 minutes ago

Can you not use EXPLAIN ANALYZE to identify steps that had the highest compute time? I think most databases have some form of this.
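E.g. from Python it's just something like this (assuming a reachable Postgres, psycopg2, and a hypothetical orders table; note that ANALYZE actually executes the query):

    import psycopg2

    conn = psycopg2.connect("dbname=mydb")   # hypothetical DSN
    with conn, conn.cursor() as cur:
        cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE total > 100")
        for (line,) in cur.fetchall():       # one text row per plan line
            print(line)                      # the node with the biggest "actual time" is the culprit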

xarici_ishler 8 hours ago

You're onto the original idea I started out with! Unfortunately it's very difficult to correlate input SQL with the output query plan – but possible. It's definitely in the future plans.