Assignment

Time Limit: 4000/2000 MS (Java/Others)    Memory Limit: 524288/524288 K (Java/Others)

Problem Description

You are given two sequences $a,b$ of length $n$ and a cost matrix $A$ of size $n \times n$. **The matrix $A$ satisfies $\boldsymbol{A_{i,j} \ge A_{i,j-1}}$ for all $1 \le i \le n$, $1 < j \le n$.** You can do the following operation an arbitrary number of times:

- Select three integers $l,r,x$ satisfying $1\le l\le r\le n$ and $1\le x\le n$, then assign $x$ to $a_i$ for all indices $i$ between $l$ and $r$, inclusive. The cost of this operation is $A_{x,r-l+1}$.

For all $i \in [0,k]$, find the minimum sum of costs to make $a$ have **at most** $i$ positions differing from $b$.

Input

The first line contains a single integer $T$ ($1\le T\le 10$), denoting the number of test cases.

For each test case, the first line contains two integers $n,k$ ($1\le n\le 100$, $1\le k\le 10$).

The second line contains $n$ integers $a_1,a_2,\cdots,a_n$ ($1 \le a_i \le n$), denoting the sequence $a$.

The third line contains $n$ integers $b_1,b_2,\cdots,b_n$ ($1 \le b_i \le n$), denoting the sequence $b$.

Each of the next $n$ lines contains $n$ integers; the $j$-th integer in the $i$-th line denotes $A_{i,j}$ ($1 \le A_{i,j} \le 10^6$). It is guaranteed that $A_{i,j}\ge A_{i,j-1}$ for all $1\le i\le n$, $1< j\le n$.

Output

For each test case, output one line with $k + 1$ integers; the $(i+1)$-th of them is the answer for $i$ ($0 \le i \le k$).

Sample Input
Sample Output
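To make the operation and its cost concrete, here is a minimal C++ sketch. The helper names `applyOperation` and `diffCount`, the 1-indexed storage, and the tiny hard-coded example are illustrative assumptions, not part of the statement: one operation $(l, r, x)$ overwrites $a_l,\cdots,a_r$ with $x$ at cost $A_{x,r-l+1}$, and the quantity bounded by $i$ is the number of positions where $a$ and $b$ differ.

```cpp
#include <cstdio>
#include <vector>
using namespace std;

// Apply one operation (l, r, x), 1-indexed as in the statement:
// a_i := x for every l <= i <= r; the cost paid is A[x][r - l + 1].
int applyOperation(vector<int>& a, const vector<vector<int>>& A,
                   int l, int r, int x) {
    for (int i = l; i <= r; ++i) a[i] = x;
    return A[x][r - l + 1];
}

// Number of positions where a differs from b -- the quantity that must
// be at most i in the final array.
int diffCount(const vector<int>& a, const vector<int>& b, int n) {
    int cnt = 0;
    for (int i = 1; i <= n; ++i) cnt += (a[i] != b[i]);
    return cnt;
}

int main() {
    // Toy example with n = 3 (index 0 is an unused dummy); the data is
    // made up for illustration and is not a sample from the problem.
    int n = 3;
    vector<int> a = {0, 1, 2, 3}, b = {0, 2, 2, 2};
    vector<vector<int>> A = {{},
                             {0, 5, 7, 9},   // A[x][len]; each row is
                             {0, 1, 4, 6},   // non-decreasing in len
                             {0, 2, 2, 8}};
    int cost = applyOperation(a, A, 1, 2, 2); // a_1..a_3 become 2, 2, 3; cost = A[2][2] = 4
    printf("cost = %d, differing positions = %d\n", cost, diffCount(a, b, n));
    return 0;
}
```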