You are given a permutation of size $n$, i.e. a sequence of $n$ distinct numbers. The task is to partition this permutation into monotonic subsequences. The number of subsequences (i.e. the size of the partition) does not need to be minimum, but it must not exceed $f(n)$, which denotes the minimum number such that any permutation of size $n$ can be split into at most $f(n)$ monotonic subsequences.
I came across this problem on Codeforces a while ago. They did provide a solution (thanks, Radewoosh!), but since I love proving things, I want to note down some of my insights and findings.
Observations
Let $k$ be the greatest integer such that $\frac{k(k+1)}{2} \le n$. Consider the permutation (of size $n$) in which the first $\frac{k(k+1)}{2}$ terms are 1, 3, 2, 6, 5, 4, 10, 9, 8, 7, ... (OEIS A038722), followed by the rest of the terms sorted decreasingly. We call this sequence $\sigma_n$. For example, $\sigma_8 = (1, 3, 2, 6, 5, 4, 8, 7)$.
What is the minimum number of monotonic partitions of $\sigma_n$?
We notice that $\sigma_n$ can be separated into several contiguous subarrays, where each subarray consists of decreasing elements. For example: $\sigma_8 = (1 \mid 3, 2 \mid 6, 5, 4 \mid 8, 7)$.
The number of such subarrays (denoted as $m$) is either $k$ or $k+1$, depending on whether $n = \frac{k(k+1)}{2}$ or not. What is the minimum number of partitions of $\sigma_n$ (denoted as $g(n)$)? We can easily tell that every element in subarray $A$ is less than any element in subarray $B$ if $A$ is to the left of $B$. We will use a greedy algorithm to compute $g(n)$. Let's say $a$ / $b$ is the number of increasing / decreasing subsequences.
- $b = 0$. If two elements are in the same subarray, they must belong to two different increasing subsequences. The biggest subarray size is $k$, and each element in this subarray belongs to a distinct increasing subsequence, hence $a \ge k$. Conversely, $k$ increasing subsequences always suffice (send the $i$-th element of every subarray to the $i$-th subsequence), so this method gives $a + b = k$.
- $b > 0$. If we have a decreasing subsequence $S$, we can expand it to the whole subarray containing it (note that a decreasing subsequence can never leave its subarray, because elements only grow from one subarray to the next). The action of expanding means we take some elements from other subsequences and bring them into $S$. This cannot add more subsequences, and may even remove some of them, so it is legitimate to expand. If we have chosen $b$ subarrays to be decreasing subsequences, $a$ will be the maximum size of the other subarrays. So those decreasing subsequences should be the biggest subarrays of $\sigma_n$ to reduce $a$. For example, the sizes of the subarrays in $\sigma_8$ are: 1, 2, 2, 3 ($m = 4$); with $b = 1$, $a = 2$ and $a + b = 3$; with $b = 2$, still $a = 2$ and $a + b = 4$. With this in mind: the subarray sizes are $1, 2, \dots, k$ (plus possibly one more of size at most $k$), so after removing the $b$ largest subarrays the biggest remaining one still has size at least $k - b$, hence $a + b \ge (k - b) + b = k$.
This basically means the first method always yields the minimal $g(n)$. The second method is at best as effective, sometimes worse, but never better than the first method.
Conclusion: $g(n) = k$.
What is the relation between $f(n)$ and $g(n)$?
It's easy to see that $g(n) \le f(n)$: $\sigma_n$ is one particular permutation of size $n$ that needs $g(n)$ monotonic subsequences, while $f(n)$ has to work for every permutation of size $n$.
I cannot tell the exact value of $f(n)$, but I can use $g(n)$ as the new upper bound instead of $f(n)$. If we can partition any permutation into at most $g(n)$ subsequences, since we're guaranteed that $g(n) \le f(n)$, such a solution will be immediately valid. We're close to solving the problem.
How to partition a permutation of size $n$ into at most $k$ subsequences?
Let me remind you that $k$ is the greatest integer such that $\frac{k(k+1)}{2} \le n$.
We can make use of the classic problem "Longest Increasing Subsequence" (or LIS). The algorithm is as follows:
- If $n = 0$, we have nothing to do.
- If the LIS has length at most $k$, then we can immediately split the permutation into at most $k$ decreasing subsequences (see more: Patience Sort; the number of piles equals the length of the LIS). There, we did it!
- Otherwise, we erase exactly $k + 1$ elements from the LIS, use them as one (increasing) subsequence, leaving $n' = n - k - 1$ elements, and let $k'$ be the largest integer such that $\frac{k'(k'+1)}{2} \le n'$. Back to step 1, with updated parameters $n \leftarrow n'$ and $k \leftarrow k'$. Since $n' < \frac{k(k+1)}{2}$, we get $k' \le k - 1$: every round spends one subsequence and lowers the budget by at least one, so we use at most $k$ subsequences in total.
The partition strategy is very elegant, though the observations and proof leading to the algorithm are quite sophisticated.
See my implementation in C++. Special thanks to Mateusz Radecki, the author of this problem; I really enjoyed it.