packing bigger pieces
don-eng opened this issue · 1 comment
I am using this to help our CNC process of cutting plywood sheets into parts. For my normal use case it is working great. One edge case I have is that sometimes a part is longer than the sheet, so it has to be joined from multiple pieces. There could be up to 2 joins / 3 sub-parts needed for a part.
How would you handle this so it gets nested? The only idea I have is to manually split the part into sub-parts at sheet length, sheet length, and the remainder (see the sketch after the example below). Any better ideas?
Example:
- sheet is 2400mm long
- part is 6000mm
- for this discussion we can ignore width: it will be specified, and the part width is much less than the sheet width. When nesting I will use the part width.
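
A minimal sketch of that manual split, in Python (purely illustrative; `split_at_sheet_length` is my own placeholder, not part of this library):

```python
def split_at_sheet_length(part_length, sheet_length):
    """Cut a part into sub-parts no longer than the sheet:
    full sheet lengths first, then the remainder."""
    pieces = []
    remaining = part_length
    while remaining > sheet_length:
        pieces.append(sheet_length)
        remaining -= sheet_length
    if remaining > 0:
        pieces.append(remaining)
    return pieces

# For the example above: a 6000mm part on 2400mm sheets
print(split_at_sheet_length(6000, 2400))  # [2400, 2400, 1200]
```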
Hi
If you can ignore the width in some way, it looks to me like you have a classic knapsack problem, so a custom algorithm would probably be best, especially if the volume is high. As I see it, your problem is to find the optimal number of splits and their lengths to minimize the wasted sheet, while keeping the number of splits per part under a certain value.
Splitting the parts as you suggest could be problematic with this library; depending on the dimensions you could end up with a lot of small wasted sheet pieces.
What you could try is to generate a few different split schemas for each part, perhaps with a heuristic that considers the part and sheet lengths. For your example:
1500|1500|1500|1500
1000|2000|2000|1000
2000|2000|2000
...
then pack the parts using each of the schemas, or a mix of them, and select the best result; a sketch of this loop follows below. Or, if the part and sheet lengths always stay the same, you could do an exhaustive search using the method above, select the best overall schema, and use it from then on.
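
As a rough sketch of that generate-and-score loop, assuming the 2400mm sheet and the 2-join limit from the question (`candidate_schemas`, `first_fit_waste`, the `STEP` grid, and the first-fit stand-in for the actual nesting run are all hypothetical placeholders, not this library's API):

```python
from itertools import combinations_with_replacement

SHEET_LENGTH = 2400   # mm, from the example
MAX_PIECES = 3        # up to 2 joins per part
STEP = 100            # grid for candidate cut lengths, mm (tune as needed)

def candidate_schemas(part_length):
    """Enumerate split schemas: multisets of sub-part lengths on a STEP
    grid, each <= SHEET_LENGTH, summing exactly to the part length."""
    lengths = range(STEP, SHEET_LENGTH + 1, STEP)
    return [combo
            for n in range(1, MAX_PIECES + 1)
            for combo in combinations_with_replacement(lengths, n)
            if sum(combo) == part_length]

def first_fit_waste(pieces):
    """Greedy 1D first-fit-decreasing packing, used as a cheap stand-in
    for the real nesting run; returns the total unused sheet length."""
    sheets = []  # remaining length on each opened sheet
    for p in sorted(pieces, reverse=True):
        for i, rem in enumerate(sheets):
            if p <= rem:
                sheets[i] -= p
                break
        else:
            sheets.append(SHEET_LENGTH - p)
    return sum(sheets)

def best_schema(part_length):
    """Pick the candidate schema with the least waste when packed alone."""
    return min(candidate_schemas(part_length), key=first_fit_waste)

print(best_schema(6000))  # (1200, 2400, 2400): 3 sheets, 1200mm waste
```

Note that for a single 6000mm part every 3-piece schema ties at 1200mm of waste; in a real job you would score the schemas across the whole cut list (possibly mixing schemas between parts), since a split that looks wasteful on its own may fill the offcut left by another part.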
Hope this helps