Problem Description
小U and 小R enjoy exploring the mysteries of binary numbers. They want a way to add two binary strings and present the sum in decimal. The catch is that the binary strings may be extremely long, so conventional fixed-width arithmetic cannot hold such large values. 小U and 小R would like you to design an algorithm that returns the decimal sum of two binary strings within a time complexity of O(n^2).
Test Samples
Sample 1:
Input:
binary1 = "101", binary2 = "110"
Output: '11'
Sample 2:
Input:
binary1 = "111111", binary2 = "10100"
Output: '83'
Sample 3:
Input:
binary1 = "111010101001001011", binary2 = "100010101001"
Output: '242420'
Sample 4:
Input:
binary1 = "111010101001011", binary2 = "10010101001"
Output: '31220'
Sample 5:
Input:
binary1 = "11", binary2 = "1"
Output: '4'
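As a quick worked check of sample 5: "11" in binary is 1*2 + 1 = 3 and "1" is 1, so the sum is 3 + 1 = 4 and the expected output is the string '4'.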
The potential difficulty here is overflow when adding very wide numbers, but Python integers have arbitrary precision, so overflow never needs to be handled explicitly. Doubling and incrementing a growing big integer once per input bit also keeps the conversion within the O(n^2) bound required by the statement.
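As a small illustration of that point (a minimal sketch; the 300-bit operand and the built-in int(s, 2) base-2 parser are used here only for the demonstration, not as part of the solution below):

# Python integers grow as needed, so a value far wider than 64 bits
# is parsed and added exactly, with no overflow to handle.
wide = "1" * 300                 # a 300-bit binary string, value 2**300 - 1
total = int(wide, 2) + int("1", 2)
print(total == 2 ** 300)         # True: the exact sum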
Code:
def convert(binary):
    # Parse a binary string into an integer, scanning from the most
    # significant bit: accumulate res = res * 2 + bit at each step.
    res = 0
    for bit in binary:
        res *= 2
        if bit == '1':
            res += 1
    return res

def solution(binary1, binary2):
    # Convert both operands to Python ints (arbitrary precision, so the
    # result cannot overflow) and return their sum as a decimal string.
    num1 = convert(binary1)
    num2 = convert(binary2)
    return str(num1 + num2)
if __name__ == "__main__":
    # You can add more test cases here
    print(solution("101", "110") == "11")
    print(solution("111111", "10100") == "83")
    print(solution("111010101001001011", "100010101001") == "242420")
    print(solution("111010101001011", "10010101001") == "31220")