When we write the latest finalized state via setLatestFinalizedState, we serialize the entire state in memory. The state is growing, and we should start trying to avoid this.
java.util.concurrent.CompletionException: java.lang.OutOfMemoryError: Java heap space
at java.base/java.util.concurrent.CompletableFuture.reportJoin(CompletableFuture.java:413)
at java.base/java.util.concurrent.CompletableFuture.join(CompletableFuture.java:2118)
at tech.pegasys.teku.storage.server.RetryingStorageUpdateChannel.retry(RetryingStorageUpdateChannel.java:133)
at tech.pegasys.teku.storage.server.RetryingStorageUpdateChannel.onStorageUpdate(RetryingStorageUpdateChannel.java:86)
at tech.pegasys.teku.storage.server.CombinedStorageChannelSplitter.onStorageUpdate(CombinedStorageChannelSplitter.java:68)
at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
at java.base/java.lang.reflect.Method.invoke(Method.java:580)
at tech.pegasys.teku.infrastructure.events.DirectEventDeliverer.executeMethod(DirectEventDeliverer.java:74)
at tech.pegasys.teku.infrastructure.events.DirectEventDeliverer.deliverToWithResponse(DirectEventDeliverer.java:67)
at tech.pegasys.teku.infrastructure.events.AsyncEventDeliverer.lambda$deliverToWithResponse$1(AsyncEventDeliverer.java:80)
at tech.pegasys.teku.infrastructure.events.AsyncEventDeliverer$QueueReader.deliverNextEvent(AsyncEventDeliverer.java:125)
at tech.pegasys.teku.infrastructure.events.AsyncEventDeliverer$QueueReader.run(AsyncEventDeliverer.java:116)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: java.lang.OutOfMemoryError: Java heap space
at tech.pegasys.teku.infrastructure.ssz.sos.SszByteArrayWriter.<init>(SszByteArrayWriter.java:23)
at tech.pegasys.teku.infrastructure.ssz.schema.SszType.sszSerializeTree(SszType.java:64)
at tech.pegasys.teku.infrastructure.ssz.SszData.sszSerialize(SszData.java:58)
at tech.pegasys.teku.storage.server.kvstore.serialization.BeaconStateSerializer.serialize(BeaconStateSerializer.java:36)
at tech.pegasys.teku.storage.server.kvstore.serialization.BeaconStateSerializer.serialize(BeaconStateSerializer.java:21)
at tech.pegasys.teku.storage.server.leveldb.LevelDbTransaction.put(LevelDbTransaction.java:50)
at tech.pegasys.teku.storage.server.kvstore.dataaccess.CombinedKvStoreDao$V4CombinedUpdater.setLatestFinalizedState(CombinedKvStoreDao.java:576)
at tech.pegasys.teku.storage.server.kvstore.KvStoreDatabase$$Lambda/0x00007c47949fe8b0.accept(Unknown Source)
at java.base/java.util.Optional.ifPresent(Optional.java:178)
at tech.pegasys.teku.storage.server.kvstore.KvStoreDatabase.doUpdate(KvStoreDatabase.java:1127)
at tech.pegasys.teku.storage.server.kvstore.KvStoreDatabase.update(KvStoreDatabase.java:646)
Current mainnet memory requirement is 246 MB.
The underlying DB system (LevelDB at least) doesn't support streaming data into the DB.
An approach would be to implement a KvStoreVariable that supports storing the variable in chunks, in a similar way to how we deal with "tables" in LevelDB. An example: https://github.com/andris9/level-stream-access/blob/master/lib/level-stream-access.js
So we could have chunks of a fixed size (e.g. 25 MB) and leverage the int sszSerialize(final SszWriter writer) interface, implementing a chunking KvStoreSerializer and BeaconStateSerializer.
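As a rough illustration of the chunking idea, here is a minimal self-contained sketch (not Teku's actual SszWriter or KvStoreSerializer API; ChunkSink and the 25 MB default are hypothetical stand-ins): a writer buffers serialized bytes and flushes each full chunk under its own key (e.g. variable key + chunk index), so the whole state never has to be materialised in a single byte array.

import java.io.ByteArrayOutputStream;

/**
 * Sketch of a chunking writer that an SSZ serializer could push bytes into.
 * ChunkSink is a hypothetical callback that persists one chunk, e.g.
 * put(variableKey + chunkIndex, bytes) in a LevelDB transaction.
 */
public class ChunkingWriter {
  public interface ChunkSink {
    void putChunk(int chunkIndex, byte[] bytes);
  }

  private static final int DEFAULT_CHUNK_SIZE = 25 * 1024 * 1024; // ~25 MB per chunk (illustrative)

  private final ChunkSink sink;
  private final int chunkSize;
  private final ByteArrayOutputStream buffer;
  private int chunkIndex = 0;

  public ChunkingWriter(final ChunkSink sink, final int chunkSize) {
    this.sink = sink;
    this.chunkSize = chunkSize;
    this.buffer = new ByteArrayOutputStream(chunkSize);
  }

  public ChunkingWriter(final ChunkSink sink) {
    this(sink, DEFAULT_CHUNK_SIZE);
  }

  /** Accept serialized bytes; flush a chunk to the sink whenever the buffer fills. */
  public void write(final byte[] bytes, int offset, int length) {
    while (length > 0) {
      final int space = chunkSize - buffer.size();
      final int toWrite = Math.min(space, length);
      buffer.write(bytes, offset, toWrite);
      offset += toWrite;
      length -= toWrite;
      if (buffer.size() == chunkSize) {
        flushChunk();
      }
    }
  }

  /** Flush any trailing partial chunk; returns the total number of chunks written. */
  public int finish() {
    if (buffer.size() > 0) {
      flushChunk();
    }
    return chunkIndex;
  }

  private void flushChunk() {
    sink.putChunk(chunkIndex++, buffer.toByteArray());
    buffer.reset();
  }
}

On read, the chunks would be fetched in index order and concatenated (or streamed into the SSZ deserializer) to reconstruct the state.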