Hadoop exception: n must be positive
While testing an HDFS scenario with 50 million files, the Datanode log kept reporting the following exception:
11/11/10 00:00:00 ERROR datanode.DataNode: DatanodeRegistration(172.17.1.23:50010, storageID=DS-857985192-202.106.199.37-50010-1320820941090, infoPort=8083, ipcPort=50020):DataXceiver
java.lang.IllegalArgumentException: n must be positive
        at java.util.Random.nextInt(Random.java:250)
        at org.apache.hadoop.hdfs.server.datanode.DataBlockScanner.getNewBlockScanTime(DataBlockScanner.java:284)
        at org.apache.hadoop.hdfs.server.datanode.DataBlockScanner.addBlock(DataBlockScanner.java:301)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:372)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
        at java.lang.Thread.run(Thread.java:662)
The exception comes from DataBlockScanner.getNewBlockScanTime (DataBlockScanner.java:284 in the trace above):

    long period = Math.min(scanPeriod, Math.max(blockMap.size(), 1) * 600 * 1000L);
    return System.currentTimeMillis() - scanPeriod + random.nextInt((int)period);

Math.max(blockMap.size(), 1) returns an int, so the "* 600" is evaluated in 32-bit arithmetic before "* 1000L" widens the result to long. Once a Datanode holds more than Integer.MAX_VALUE / 600 blocks (about 3.58 million), that multiplication wraps to a negative value, period goes negative, and the (int)period cast hands Random.nextInt a non-positive bound.
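A minimal sketch that reproduces the overflow outside Hadoop (the 50-million block count is assumed for illustration; any count above roughly 3.58 million triggers it):

    import java.util.Random;

    public class NegativePeriodDemo {
        public static void main(String[] args) {
            long scanPeriod = 21L * 24 * 3600 * 1000; // default scan period: three weeks, in ms
            int blockCount = 50000000;                // assumed block count, matching the test scale

            // blockCount * 600 is computed as int and wraps negative
            // before "* 1000L" widens the result to long.
            long period = Math.min(scanPeriod, Math.max(blockCount, 1) * 600 * 1000L);
            System.out.println("period = " + period); // prints -64771072000

            // Same call as getNewBlockScanTime: throws
            // java.lang.IllegalArgumentException: n must be positive
            new Random().nextInt((int) period);
        }
    }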
The workaround: replace

    random.nextInt((int)period)

with

    random.nextInt(Math.abs((int)period))
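Applied to the method, the patch looks roughly like the sketch below (the signature and surrounding fields are reconstructed for illustration, not copied from the Hadoop source):

    private long getNewBlockScanTime() {
        long period = Math.min(scanPeriod,
                               Math.max(blockMap.size(), 1) * 600 * 1000L);
        // Math.abs turns the overflowed negative bound positive,
        // so Random.nextInt no longer throws.
        return System.currentTimeMillis() - scanPeriod
               + random.nextInt(Math.abs((int) period));
    }

Note that Math.abs only masks the overflow (and Math.abs(Integer.MIN_VALUE) is itself still negative); keeping the whole computation in long arithmetic, e.g. Math.max(blockMap.size(), 1) * 600L * 1000L, avoids the negative period in the first place.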