MaxRects speedup? #21

Open
samsonsite1 opened this issue Dec 10, 2019 · 11 comments

Comments

@samsonsite1

Hello,

This one function call, PruneFreeList(), seems to be slowing down the entire MaxRectsBinPack routine, as it's called after every insert. From what I understand, PruneFreeList() runs a containment test on a list of free rectangles. If one rect is inside another, it's erased from the list.
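For reference, my understanding of the prune is a pairwise containment scan roughly like this (just a sketch of what I think it does, not the exact code in MaxRectsBinPack.cpp):

```cpp
#include <vector>

struct Rect { int x, y, width, height; };

// True if a lies fully inside b.
static bool IsContainedIn(const Rect &a, const Rect &b)
{
    return a.x >= b.x && a.y >= b.y
        && a.x + a.width  <= b.x + b.width
        && a.y + a.height <= b.y + b.height;
}

// O(n^2) pairwise scan: erase every free rectangle contained in another one.
void PruneFreeList(std::vector<Rect> &freeRects)
{
    for (size_t i = 0; i < freeRects.size(); ++i)
        for (size_t j = i + 1; j < freeRects.size(); ++j)
        {
            if (IsContainedIn(freeRects[i], freeRects[j]))
            {
                freeRects.erase(freeRects.begin() + i);
                --i; // re-check the element that slid into slot i
                break;
            }
            if (IsContainedIn(freeRects[j], freeRects[i]))
            {
                freeRects.erase(freeRects.begin() + j);
                --j; // re-check the element that slid into slot j
            }
        }
}
```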

Is there a better way to find all possible containments, and then delete them?
Thanks.

@juj
Owner

juj commented Dec 11, 2019

That is at the heart of MaxRects. If you do find a better way, you can certainly write a paper about it! :)

It is possible to get constant factor speedups by e.g. sweep sorting, but nothing I can think of that would fundamentally change the O(n^2) nature of the routine.

A small optimization might be to swap the element to erase with the last one in the container and call .pop_back(), instead of calling .erase().
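For illustration, something along these lines (a sketch; it assumes the free list is a std::vector and that element order does not matter):

```cpp
#include <utility>
#include <vector>

// Swap-and-pop erase: O(1) instead of the O(n) element shifting of .erase().
// Element order is not preserved, which the containment prune does not need.
template <class T>
void EraseUnordered(std::vector<T> &v, size_t index)
{
    std::swap(v[index], v.back());
    v.pop_back();
}
```

If this is used inside the prune loops, the element that got swapped into the erased slot still needs to be re-tested at that index.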

Another micro-optimization might be to sort the rectangles by their surface area, to avoid having to test both if i is contained in j and if j is contained in i (if i is contained in j, then area(i) <= area(j)). Not sure if that is a win in practice.
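Roughly like this (a sketch reusing the Rect/IsContainedIn helpers from the snippet earlier in the thread; after sorting ascending by area, only the "i contained in j" direction needs testing for i < j):

```cpp
#include <algorithm>
#include <vector>

// Sketch: sort ascending by area first. For i < j, area(i) <= area(j), so
// only "freeRects[i] contained in freeRects[j]" is possible (equal-area
// duplicates are also caught, since containment here is non-strict).
void PruneFreeListSorted(std::vector<Rect> &freeRects)
{
    std::sort(freeRects.begin(), freeRects.end(),
              [](const Rect &a, const Rect &b)
              { return a.width * a.height < b.width * b.height; });

    for (size_t i = 0; i < freeRects.size(); ++i)
        for (size_t j = i + 1; j < freeRects.size(); ++j)
            if (IsContainedIn(freeRects[i], freeRects[j]))
            {
                freeRects.erase(freeRects.begin() + i);
                --i; // re-check the element that slid into slot i
                break;
            }
}
```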

A further optimization may be to manage a network/graph of overlapping rectangles, and do a graph search on only those. That may be the most efficient way, but still O(n^2) complexity.

@samsonsite1
Author

Thanks for the reply.

Those are some good ideas. Specifically, I'm trying to speed up one particular MaxRects method, BottomLeft. Since the rects build up from the bottom, is there a way to remove free rects from the list that have no chance of being used, provided that all rects are sorted by area or perimeter before packing? Or would that prevent the algorithm from doing its job?

I'm thinking if I can keep the free list relatively small throughout the entire packing process, then that would speed up everything greatly.

@juj
Owner

juj commented Dec 11, 2019

You can certainly remove rectangles from the free list; that will cause the corresponding area to no longer be considered (if that was the only rectangle representing that particular space).

The size of the free list does dominate the runtime, so the fewer free list items there are, the faster it runs. Probably the first approach I'd try would be to maintain sets of disjoint free rectangles (a vector/list of vectors of rectangles), where each vector of rectangles represents an "island" of rectangles that overlap, directly or transitively. The free list then decomposes into separate islands/sets of disjoint rectangles, and only each island individually needs to be processed in O(n^2) time. Perhaps not quite as efficient as a graph search, but probably very similar.
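Very roughly something like this (an untested sketch of the bookkeeping, not code from this repo; Island and InsertFreeRect are made-up names, and the Rect type is reused from the snippet earlier in the thread):

```cpp
#include <utility>
#include <vector>

// True if the two rectangles overlap (share any area).
static bool Overlaps(const Rect &a, const Rect &b)
{
    return a.x < b.x + b.width  && b.x < a.x + a.width
        && a.y < b.y + b.height && b.y < a.y + a.height;
}

// An "island" is a set of free rectangles that overlap, directly or
// transitively. The pairwise prune then only needs to run inside one island.
using Island = std::vector<Rect>;

// Insert a new free rectangle: merge every island it touches into one
// (the new rectangle may bridge previously separate islands).
void InsertFreeRect(std::vector<Island> &islands, const Rect &r)
{
    Island merged;
    merged.push_back(r);
    for (size_t i = 0; i < islands.size(); )
    {
        bool touches = false;
        for (const Rect &other : islands[i])
            if (Overlaps(r, other)) { touches = true; break; }

        if (touches)
        {
            merged.insert(merged.end(), islands[i].begin(), islands[i].end());
            islands.erase(islands.begin() + i); // absorbed into `merged`
        }
        else
            ++i;
    }
    islands.push_back(std::move(merged));
}
```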

@samsonsite1
Author

Great, thanks for your suggestions.

@samsonsite1
Author

samsonsite1 commented Dec 13, 2019

Just wondering, do graphs handle cases of merging better?

If I were to maintain an overlap list by insertion, I see a situation where two or more lists could end up overlapping each other. So I think I would need to throw in a merge routine after the insertion. The cost of maintaining this list might be more than it's worth.

@juj
Owner

juj commented Dec 14, 2019

I do not have an answer for that; I have never implemented the code to measure it. Extra bookkeeping may certainly outweigh the benefit, and the performance characteristics will likely vary greatly with the sizes of the rectangles being packed into the bin as well.

@LiangJiSama

I'm wondering whether there is a need to test each pair of free rectangles.
Since the old free rectangles have already been tested against each other, maybe only the newly generated free rectangles have a chance of being contained in, or containing, other free rectangles. Is there any situation I might have missed? Or did I misunderstand this part of the algorithm?

juj added a commit that referenced this issue Jul 8, 2021
@juj
Owner

juj commented Jul 8, 2021

That is a good point, and certainly true.

When a new rectangle is inserted and tested against all free rectangles, the set of new free rectangles that is formed first needs to be tested internally to remove all containments (all pairs of newly created free rectangles). Then each of the remaining rectangles needs to be tested against the existing/old free rectangles. But the existing free rectangles do not need to be tested against each other.
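In sketch form the merge step might look roughly like this (illustrative only, reusing the Rect/IsContainedIn helpers from earlier in the thread; not the exact code that was pushed):

```cpp
#include <vector>

// freeRects: the old free list, already mutually pruned.
// newFreeRects: rectangles created by splitting during this insert.
void MergeNewFreeRects(std::vector<Rect> &freeRects,
                       std::vector<Rect> newFreeRects)
{
    // 1. Prune the new rectangles against each other.
    for (size_t i = 0; i < newFreeRects.size(); ++i)
        for (size_t j = i + 1; j < newFreeRects.size(); ++j)
        {
            if (IsContainedIn(newFreeRects[i], newFreeRects[j]))
            { newFreeRects.erase(newFreeRects.begin() + i); --i; break; }
            if (IsContainedIn(newFreeRects[j], newFreeRects[i]))
            { newFreeRects.erase(newFreeRects.begin() + j); --j; }
        }

    // 2. Prune new vs. old in both directions. Old vs. old is skipped,
    //    because the old rectangles were already pruned against each other.
    for (const Rect &n : newFreeRects)
    {
        bool contained = false;
        for (size_t i = 0; i < freeRects.size(); )
        {
            if (IsContainedIn(n, freeRects[i])) { contained = true; break; }
            if (IsContainedIn(freeRects[i], n))
                freeRects.erase(freeRects.begin() + i);
            else
                ++i;
        }
        if (!contained)
            freeRects.push_back(n);
    }
}
```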

It has been close to a decade since I wrote the initial implementation, and I vaguely recall that I did not implement it that way, likely for code simplicity and because the above process does not lend itself to an improvement in Big-O/Theta analysis - there are no guarantees on how many or how few of these new rectangles will form.

In practice it can be an observable constant-factor speedup though. Pushed that kind of implementation to this repo now - I had something like that implemented from before, along with a few other micro-optimizations. That does unfortunately make the code diverge from the results of the original paper, which I hesitated to do for a while - but the paper is quite old now anyway, so probably not that important anymore.

@LiangJiSama

Thanks,

I've been working on this problem for a while, and I'm wondering if there is any further improvement to be had in the final result, like the fill rate of a single bin or the number of bins, for MaxRects. Or is there any other, better algorithm for this problem? I've found some improvements for other algorithms like Skyline, but development seems to have come to an end for MaxRects, since it is a simplified version of a solution to the 3D container loading problem and little discussion of this algorithm can be found in later papers.

I'm dealing with the problem of packing items into a single bin while keeping the fill rate of that bin as high as possible. MaxRects seems to be the best solution in practice, since there is a need for speed, so meta-heuristics seem unacceptable. I've seen some improvements to Skyline best-fit regarding its scoring function; I don't know if it can beat MaxRects in practice.

@juj
Owner

juj commented Jul 9, 2021

> I've been working on this problem for a while, and I'm wondering if there is any further improvement to be had in the final result, like the fill rate of a single bin or the number of bins, for MaxRects. Or is there any other, better algorithm for this problem?

Unfortunately I don't know of better improvements to the algorithms.

> is there any further improvement to be had in the final result, like the fill rate

> there is a need for speed, so meta-heuristics seem unacceptable

These are two conflicting goals; improving both speed and packing efficiency at the same time will likely require some clever new techniques.

If you'd like to make MaxRects faster, something to try could be to avoid considering all free rectangles: e.g. try only a constant number of them (maybe randomly selected?) and select the best from those. That would speed up the algorithm at the expense of packing efficiency.
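For instance, roughly like this (a hypothetical sketch using a best-short-side-fit style score; FindApproxBestFreeRect and ScorePlacement are made-up names, not functions in this repo, and Rect is reused from earlier in the thread):

```cpp
#include <algorithm>
#include <climits>
#include <random>
#include <vector>

// Hypothetical score: smaller leftover short side = tighter fit (lower is better).
static int ScorePlacement(const Rect &f, int w, int h)
{
    if (w > f.width || h > f.height)
        return INT_MAX; // does not fit at all
    return std::min(f.width - w, f.height - h);
}

// Consider at most maxCandidates randomly chosen free rectangles instead of
// scanning the whole free list; returns the index of the best one, or -1.
int FindApproxBestFreeRect(const std::vector<Rect> &freeRects, int w, int h,
                           size_t maxCandidates, std::mt19937 &rng)
{
    if (freeRects.empty())
        return -1;

    std::uniform_int_distribution<size_t> pick(0, freeRects.size() - 1);
    size_t samples = std::min(maxCandidates, freeRects.size());

    int bestScore = INT_MAX, bestIndex = -1;
    for (size_t s = 0; s < samples; ++s)
    {
        // Scan everything if the list is small, otherwise sample randomly.
        size_t i = (freeRects.size() <= maxCandidates) ? s : pick(rng);
        int score = ScorePlacement(freeRects[i], w, h);
        if (score < bestScore) { bestScore = score; bestIndex = (int)i; }
    }
    return bestIndex;
}
```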

If you'd like to get a better fill rate and the packing does not need to be fully online, a semi-online algorithm might work: keep a window of the next few rectangles to place, and always sort within that window to find the best ordering of the rectangles to pack. That would also be only a constant factor overhead, as opposed to the slowdown coming from a full offline input sort.

Maybe something that can improve fill rate is knowing in advance the input distribution of the rectangles to pack. E.g. if you know that the minimum side length of any input rectangle will be >= k, then you could have a fast heuristic that penalizes placements that leave free rectangle strips with a side length < k, since those are known up front to be wasted areas that no input rectangle will ever be able to fit in.
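As a rough sketch of such a penalty (hypothetical code reusing the Rect type from earlier in the thread, assuming every input rectangle has both sides >= minSide; the actual split geometry in MaxRects is more involved than this):

```cpp
// Extra penalty for placing a w x h rectangle at the corner of free rect f:
// leftover strips thinner than minSide can never hold another input
// rectangle, so count their area as guaranteed waste.
int WastedStripPenalty(const Rect &f, int w, int h, int minSide)
{
    int penalty = 0;
    int rightStrip = f.width  - w; // vertical strip left over to the right
    int topStrip   = f.height - h; // horizontal strip left over on top
    if (rightStrip > 0 && rightStrip < minSide)
        penalty += rightStrip * f.height;
    if (topStrip > 0 && topStrip < minSide)
        penalty += topStrip * f.width;
    return penalty; // add this (weighted) to the normal placement score
}
```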

@LiangJiSama

I think fill rate might be the first thing to consider in my case. Thanks for your great ideas anyway.
