Matrix

Time Limit: 4000/2000 MS (Java/Others)    Memory Limit: 65536/65536 K (Java/Others)
Total Submission(s): 2182    Accepted Submission(s): 625

Problem Description
Let A be a 1*N matrix, each element of which is either 0 or 1. You are to find an A that maximizes D = (A*B - C)*A^T, where B is a given N*N matrix whose elements are non-negative, C is a given 1*N matrix whose elements are also non-negative, and A^T is the transposition of A (i.e. an N*1 matrix).

Input
The first line contains the number of test cases T, followed by T test cases. For each case, the first line contains an integer N (1 <= N <= 1000). Each of the next N lines contains N integers describing the matrix B; the j-th integer on the i-th line is B[i][j]. One more line follows, containing N integers describing the matrix C, the i-th of which is C[i]. You may assume that sum{B[i][j]} < 2^31 and sum{C[i]} < 2^31.

Output
For each case, output the maximum D you may get.

Sample Input
Sample Output
Hint
For the sample, A = [1, 1, 0] or A = [1, 1, 1] would attain the maximum D.

Author
BUPT

Source
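To make the objective concrete: D = A*B*A^T - C*A^T equals the sum of B[i][j] over all pairs (i, j) with A[i] = A[j] = 1, minus the sum of C[i] over all i with A[i] = 1, so the task is to pick a subset of indices maximizing that difference (the intended solution for N up to 1000 is usually a minimum-cut formulation; that is not shown here). Below is a minimal, hedged sketch that brute-forces D over all 0/1 vectors A for tiny N, purely to illustrate the objective. The function name `max_D` and the 2x2 instance are made up for illustration; the problem's own sample data is omitted above.

```python
from itertools import product

def max_D(B, C):
    # Exhaustive search over all 0/1 row vectors A -- exponential in N,
    # so this only illustrates the objective on tiny instances, not the
    # intended algorithm for N up to 1000.
    n = len(C)
    best = 0  # A = all zeros always yields D = 0
    for A in product((0, 1), repeat=n):
        # D = A*B*A^T - C*A^T
        d = sum(B[i][j] for i in range(n) for j in range(n) if A[i] and A[j])
        d -= sum(C[i] for i in range(n) if A[i])
        best = max(best, d)
    return best

# Hypothetical 2x2 instance (not the problem's sample):
print(max_D([[1, 2], [3, 4]], [5, 1]))  # prints 4 (take both indices: 10 - 6)
```

Note that A = [0, ..., 0] always gives D = 0, so the answer is never negative.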