I have a very large number represented as binary in JavaScript:
var largeNumber = '11010011010110100001010011111010010111011111000010010111000111110011111011111000001100000110000011000001100111010100111010101110100010001011010101110011110000011000001100000110000011001001100000110000011000001100000110000111000011100000110000011000001100000110000011000010101100011001110101101001100110100100000110000011000001100000110001001101011110110010001011010001101011010100011001001110001110010100111011011111010000110001110010101010001111010010000101100001000001100001011000011011111000011110001110111110011111111000100011110110101000101100000110000011000001100000110000011010011101010110101101001111101001010010111101011000011101100110010011001001111101'
When I convert it to decimal using parseInt(largeNumber, 2),
it gives me 1.5798770299367407e+199,
but when I try to convert it back to binary with:
parseInt(`1.5798770299367407e+199`, 2)
it returns 1.
(I think this has something to do with how parseInt
rounds the value), whereas I expected to get largeNumber back.
Can you explain this behavior to me, and how can I convert it back to the original value in JavaScript?
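A minimal sketch of what seems to be happening, and a lossless round trip using BigInt (a standard JavaScript feature; the short stand-in string and variable names here are my own, not from the question):

```javascript
// parseInt stops at the first character that is not a valid digit in
// the given radix, so parsing "1.57...e+199" with radix 2 reads only
// the leading "1" and returns 1.
console.log(parseInt('1.5798770299367407e+199', 2)); // 1

// Separately, Number cannot represent integers above
// Number.MAX_SAFE_INTEGER (2^53 - 1) exactly, so the decimal value
// above is already an approximation of the original binary string.

// BigInt keeps every bit exactly; prefixing the binary string with
// "0b" lets the BigInt constructor parse it.
const bits = '1101'; // stand-in for the full largeNumber string
const asBigInt = BigInt('0b' + bits);
const backToBinary = asBigInt.toString(2);
console.log(backToBinary === bits); // true
```

Note that a numeric round trip like this drops leading zeros (an initial run of 0 bits), so the original string length would need to be stored alongside the value.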
Edit: This question is the result of an experiment in storing and transferring a large amount of boolean data. The largeNumber
is a representation of a collection of boolean values, [true,true,false,true ...],
that has to be shared between the client, a client-side worker, and the server.
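One way that boolean-array-to-bit-string mapping could be sketched (the helper names here are illustrative, not from the question), including packing the bits into bytes for transport:

```javascript
// Hypothetical helpers for round-tripping a boolean array through a
// bit string.
const toBits = (flags) => flags.map((b) => (b ? '1' : '0')).join('');
const fromBits = (bits) => [...bits].map((c) => c === '1');

const flags = [true, true, false, true];
const bits = toBits(flags); // '1101'
console.log(fromBits(bits)); // [true, true, false, true]

// For transfer, the bit string can be packed into bytes
// (most significant bit first within each byte):
const packed = new Uint8Array(Math.ceil(bits.length / 8));
for (let i = 0; i < bits.length; i++) {
  if (bits[i] === '1') packed[i >> 3] |= 1 << (7 - (i % 8));
}
```

Because the array length is kept explicitly (bits.length), this avoids the leading-zero loss that a purely numeric representation would have when the collection starts with false values.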