equation

Time Limit: 2000/1000 MS (Java/Others)    Memory Limit: 262144/262144 K (Java/Others)
Total Submission(s): 2537    Accepted Submission(s): 687

Problem Description

You are given two integers $N, C$ and two integer sequences $a$ and $b$ of length $N$. The sequences are indexed from $1$ to $N$. Please solve the following equation for $x$: $\sum\limits_{i=1}^{N}|a_i \cdot x + b_i| = C$, where $|v|$ denotes the absolute value of $v$.

Input

The first line contains an integer $T$ indicating there are $T$ tests. Each test consists of $N+1$ lines. The first line contains two integers $N, C$. The $i$-th of the following $N$ lines contains two integers $a_i, b_i$.

* $1 \le T \le 50$
* $1 \le N \le 10^5$
* $1 \le a_i \le 1000$
* $-1000 \le b_i \le 1000$
* $1 \le C \le 10^9$
* only $5$ tests have $N$ larger than $1000$

Output

For each test, output one line. If there are infinitely many solutions, this line consists of only the integer $-1$. Otherwise, this line starts with an integer $m$ indicating the number of solutions, followed by $m$ fractions from smallest to largest giving all possible answers. (It can be proved that all solutions can be written as fractions.) Each fraction should be in the form "a/b", where a is an integer, b is a positive integer, and $gcd(abs(a), b) = 1$. If the answer is $0$, you should output "0/1".

Sample Input
Sample Output
Source
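The left-hand side $f(x) = \sum |a_i x + b_i|$ is a convex piecewise-linear function whose breakpoints are $x = -b_i/a_i$, so $f(x) = C$ has zero, one, or two solutions, unless some zero-slope segment lies exactly at height $C$, in which case there are infinitely many. The following is a minimal sketch of that approach in Python, not a reference solution; function names are illustrative, and exact arithmetic via `Fraction` handles the reduced-fraction output requirement directly:

```python
from fractions import Fraction

def solve(C, pairs):
    """Solve sum |a*x + b| = C for the given (a, b) pairs.

    Returns a sorted list of Fraction solutions, or None when there
    are infinitely many solutions.
    """
    # Breakpoints of the convex piecewise-linear function f.
    xs = sorted(set(Fraction(-b, a) for a, b in pairs))

    def f(x):
        return sum(abs(a * x + b) for a, b in pairs)

    sols = set()
    # A breakpoint itself may be a solution.
    for x in xs:
        if f(x) == C:
            sols.add(x)

    # One probe point strictly inside each open segment, to read off
    # the local slope and intercept of f.
    probes = [xs[0] - 1]
    for i in range(len(xs) - 1):
        probes.append((xs[i] + xs[i + 1]) / 2)
    probes.append(xs[-1] + 1)
    bounds = ([(None, xs[0])]
              + [(xs[i], xs[i + 1]) for i in range(len(xs) - 1)]
              + [(xs[-1], None)])

    for p, (lo, hi) in zip(probes, bounds):
        # Slope on this segment: each term contributes +a or -a.
        s = sum(a if a * p + b > 0 else -a for a, b in pairs)
        t = f(p) - s * p  # intercept of the local linear piece
        if s == 0:
            if t == C:
                return None  # a flat segment at height C: infinitely many
        else:
            x = Fraction(C - t, s)
            # Accept only if the candidate lies strictly inside the segment.
            if (lo is None or x > lo) and (hi is None or x < hi):
                sols.add(x)
    return sorted(sols)

def fmt(x):
    # Fraction is already in lowest terms; prints 0 as "0/1" as required.
    return f"{x.numerator}/{x.denominator}"
```

For example, `solve(5, [(1, 0)])` (the equation $|x| = 5$) yields the two solutions $-5$ and $5$, while `solve(2, [(1, 1), (1, -1)])` ($|x+1| + |x-1| = 2$) returns `None`, since $f$ equals $2$ on the whole interval $[-1, 1]$.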