[ACMcoder] A + B Problem II

Problem Description
I have a very simple problem for you. Given two integers A and B, your job is to calculate the Sum of A + B.

Input
The first line of the input contains an integer T (1<=T<=20), the number of test cases. Then T lines follow; each line consists of two positive integers, A and B. Note that the integers are very large, which means you should not process them with 32-bit integers. You may assume the length of each integer will not exceed 1000.

Output
For each test case, you should output two lines. The first line is “Case #:”, where # is the number of the test case. The second line is an equation “A + B = Sum”, where Sum is the result of A + B. Note there are some spaces in the equation. Output a blank line between two consecutive test cases.

Sample Input

2
1 2
112233445566778899 998877665544332211

Sample Output

Case 1:
1 + 2 = 3

Case 2:
112233445566778899 + 998877665544332211 = 1111111111111111110

Solution Approach

The idea is straightforward: the numbers are far too long for built-in integer types, so add them as strings, digit by digit from the least significant end, taking care to propagate the carry.
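
The core loop is ordinary schoolbook addition: walk both strings from their last characters, add each digit pair plus the carry, write sum % 10, and keep sum / 10 as the next carry. Here is a minimal sketch of that idea using std::string (addBig is a hypothetical helper name; the accepted solution below works on raw char buffers instead):

#include <algorithm>
#include <string>

// Minimal sketch of schoolbook addition on decimal strings.
// Assumes non-empty digit strings, as the problem guarantees.
std::string addBig(const std::string &a, const std::string &b)
{
    std::string result;
    int i = (int)a.size() - 1;   // last digit of a
    int j = (int)b.size() - 1;   // last digit of b
    int carry = 0;
    while (i >= 0 || j >= 0 || carry != 0)
    {
        int sum = carry;
        if (i >= 0) sum += a[i--] - '0';    // ASCII digit -> numeric value
        if (j >= 0) sum += b[j--] - '0';
        result.push_back(sum % 10 + '0');   // low digit of this column
        carry = sum / 10;                   // at most 1 in base-10 addition
    }
    std::reverse(result.begin(), result.end()); // digits were produced backwards
    return result;
}

The accepted code below performs the same column-by-column addition directly on the scanf buffers, which avoids the final reverse by writing the result right to left.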

Solution Code

// A few details took a while to get right:
// - using max requires <algorithm>
// - no blank line may be printed after the last test case
#include <cstdio>
#include <iostream>
#include <algorithm>
using namespace std;

int main()
{
    char n1[1001] = {0};   // first operand: up to 1000 digits plus '\0'
    char n2[1001] = {0};   // second operand
    int total;
    scanf("%d", &total);   // number of test cases
    int outputindex = 0;
    while (outputindex < total)
    {
        scanf("%s %s", n1, n2);   // read both operands as whitespace-separated strings
        // measure both operands (manual strlen)
        int len1 = 0;
        int len2 = 0;
        while ('\0' != n1[len1])
            len1++;
        while ('\0' != n2[len2])
            len2++;

        char n[1002] = {0};                  // result buffer: one extra slot for a final carry
        int index1 = len1 - 1;               // walk both operands from their last digit
        int index2 = len2 - 1;
        int index = max(index1, index2) + 1; // result is filled right to left
        char bit = 0;                        // carry: 0 or 1
        int num = 0;
        // add digit pairs while both operands still have digits left
        while (index1 >= 0 && index2 >= 0)
        {
            // n1[] and n2[] hold ASCII digits; subtracting 48 ('0') once
            // leaves num as an ASCII digit plus any carry
            num = n1[index1] + n2[index2] - 48 + bit;
            if (num > 57)   // past ASCII '9': fix the digit up, carry 1
            {
                n[index] = num - 10;
                bit = 1;
            }
            else
            {
                n[index] = num;
                bit = 0;
            }
            --index1;
            --index2;
            --index;
        }

        while (index1 >= 0)   // n1 is longer: drain its remaining digits with the carry
        {
            num = n1[index1] + bit;
            if (num > 57)
            {
                n[index] = num - 10;
                bit = 1;
            }
            else
            {
                n[index] = num;
                bit = 0;
            }
            --index1;
            --index;
        }

        while (index2 >= 0)   // n2 is longer: drain its remaining digits with the carry
        {
            num = n2[index2] + bit;
            if (num > 57)
            {
                n[index] = num - 10;
                bit = 1;
            }
            else
            {
                n[index] = num;
                bit = 0;
            }
            --index2;
            --index;
        }
        if (bit != 0)   // a final carry adds one leading digit
        {
            n[index] = bit + '0';
        }

        char *p = n;
        while ('\0' == *p)   // skip untouched leading NUL bytes to find the first digit
            p++;
        ++outputindex;
        cout<<"Case "<<outputindex<<":"<<endl;
        cout<<n1<<" + "<<n2<<" = "<<p<<endl;

        if (outputindex != total)   // blank line between cases, but not after the last one
            cout<<endl;
    }
    return 0;
}