The Beginning: start_kernel()

This article walks through every function call made during start_kernel(), giving the reader a comprehensive analysis pinned to a specific kernel version.
--------- beginning of crash 06-15 04:08:26.793 996 996 F libc : Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 996 (init), pid 996 (init) 06-15 04:08:26.809 996 996 F libc : crash_dump helper failed to exec, or was killed 06-15 04:08:46.925 5241 5241 F libc : Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 5241 (init), pid 5241 (init) 06-15 04:08:47.009 5241 5241 F libc : crash_dump helper failed to exec, or was killed 06-15 04:19:49.126 20112 30927 F libc : Fatal signal 6 (SIGABRT), code -6 (SI_TKILL) in tid 30927 (Signal Catcher), pid 20112 (m.obric.camera2) 06-15 04:19:50.637 430 430 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** 06-15 04:19:50.637 430 430 F DEBUG : Build fingerprint: 'unknown/pacific/pacific:15/0.8.0.0/94:user/test-keys' 06-15 04:19:50.637 430 430 F DEBUG : Revision: '0' 06-15 04:19:50.637 430 430 F DEBUG : ABI: 'arm64' 06-15 04:19:50.637 430 430 F DEBUG : Timestamp: 2025-06-15 04:19:49.413255468+0800 06-15 04:19:50.637 430 430 F DEBUG : Process uptime: 396s 06-15 04:19:50.637 430 430 F DEBUG : Cmdline: com.obric.camera2 06-15 04:19:50.637 430 430 F DEBUG : pid: 20112, tid: 30927, name: Signal Catcher >>> com.obric.camera2 <<< 06-15 04:19:50.637 430 430 F DEBUG : uid: 10057 06-15 04:19:50.637 430 430 F DEBUG : tagged_addr_ctrl: 0000000000000001 (PR_TAGGED_ADDR_ENABLE) 06-15 04:19:50.637 430 430 F DEBUG : pac_enabled_keys: 000000000000000f (PR_PAC_APIAKEY, PR_PAC_APIBKEY, PR_PAC_APDAKEY, PR_PAC_APDBKEY) 06-15 04:19:50.637 430 430 F DEBUG : signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr -------- 06-15 04:19:50.637 430 430 F DEBUG : Abort message: 'Caused HeapTaskDaemon failure : SuspendAll timeout: Unsuspended threads: Thread[2,tid=30927,Runnable,Thread*=0x717887c630,peer=0x158c02d0,"Signal Catcher"], Info for Thread[2,tid=30927,Runnable,Thread*=0x717887c630,peer=0x158c02d0,"Signal Catcher"]:Signal Catcher tid: 30927, state&flags: 0x9, priority: 10, barrier value: 1, Target states: [30927 (Signal Catcher) S 1585 1585 0 0 -1 4194368 53581 0 210 0 455 116 0 0 0 -20 120 0 6, 30927 (Signal Catcher) S 1585 1585 0 0 -1 4194368 53581 0 210 0 455 116 0 0 0 -20 120 0 6]1@481294200252 Final wait time: 2.023s' 06-15 04:19:50.637 430 430 F DEBUG : x0 fffffffffffffffc x1 0000000000000089 x2 0000000000000010 x3 00000070108fdd18 06-15 04:19:50.637 430 430 F DEBUG : x4 0000000000000000 x5 00000000ffffffff x6 00000000ffffffff x7 7365786574756d20 06-15 04:19:50.637 430 430 F DEBUG : x8 0000000000000062 x9 590991cad317e6aa x10 ffffffffffd13893 x11 0000000033e148f4 06-15 04:19:50.637 430 430 F DEBUG : x12 00000000684dd964 x13 000000007fffffff x14 0000000000052a7a x15 000000031f353f17 06-15 04:19:50.637 430 430 F DEBUG : x16 00000072c251a0d8 x17 00000072c24c6f00 x18 000000701078c000 x19 0000000000000010 06-15 04:19:50.637 430 430 F DEBUG : x20 00000070108fdd18 x21 0000007178a68578 x22 0000000000000089 x23 00000070108ff860 06-15 04:19:50.637 430 430 F DEBUG : x24 00000070108ff8c0 x25 0000000000000000 x26 00000000ee3c61b7 x27 0000000000000001 06-15 04:19:50.637 430 430 F DEBUG : x28 0000000000000000 x29 00000070108fdd30 06-15 04:19:50.637 430 430 F DEBUG : lr 00000072c249f5b8 sp 00000070108fdd10 pc 00000072c24c6f24 pst 0000000060001000 06-15 04:19:50.637 430 430 F DEBUG : 18 total frames 06-15 04:19:50.637 430 430 F DEBUG : backtrace: 06-15 04:19:50.637 430 430 F DEBUG : #00 pc 000000000008bf24 /apex/com.android.runtime/lib64/bionic/libc.so (syscall+36) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:19:50.637 430 430 F DEBUG : #01 pc 
00000000000645b4 /apex/com.android.runtime/lib64/bionic/libc.so (__futex_wait_ex(void volatile*, bool, int, bool, timespec const*)+148) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:19:50.637 430 430 F DEBUG : #02 pc 0000000000072f48 /apex/com.android.runtime/lib64/bionic/libc.so (pthread_cond_timedwait+136) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:19:50.637 430 430 F DEBUG : #03 pc 00000000000b0aec /apex/com.android.art/lib64/libc++.so (std::__1::condition_variable::__do_timed_wait(std::__1::unique_lock<std::__1::mutex>&, std::__1::chrono::time_point<std::__1::chrono::system_clock, std::__1::chrono::duration<long long, std::__1::ratio<1l, 1000000000l>>>)+96) (BuildId: 53e0091d25a788802d2d3a5324f79b527df4913f) 06-15 04:19:50.637 430 430 F DEBUG : #04 pc 0000000000093530 /apex/com.android.art/lib64/libunwindstack.so (unwindstack::ThreadEntry::Wait(unwindstack::WaitType)+140) (BuildId: c12353edf5bb03325316f4802d7fa4b4) 06-15 04:19:50.637 430 430 F DEBUG : #05 pc 00000000000939e4 /apex/com.android.art/lib64/libunwindstack.so (unwindstack::ThreadUnwinder::SendSignalToThread(int, int)+296) (BuildId: c12353edf5bb03325316f4802d7fa4b4) 06-15 04:19:50.637 430 430 F DEBUG : #06 pc 0000000000093bec /apex/com.android.art/lib64/libunwindstack.so (unwindstack::ThreadUnwinder::UnwindWithSignal(int, int, std::__1::unique_ptr<unwindstack::Regs, std::__1::default_delete<unwindstack::Regs>>*, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>> const*, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>> const*)+104) (BuildId: c12353edf5bb03325316f4802d7fa4b4) 06-15 04:19:50.637 430 430 F DEBUG : #07 pc 00000000000606f4 /apex/com.android.art/lib64/libunwindstack.so (unwindstack::AndroidLocalUnwinder::InternalUnwind(std::__1::optional<int>, unwindstack::AndroidUnwinderData&)+364) (BuildId: c12353edf5bb03325316f4802d7fa4b4) 06-15 04:19:50.637 430 430 F DEBUG : #08 pc 00000000007a3be0 /apex/com.android.art/lib64/libart.so (art::DumpNativeStack(std::__1::basic_ostream<char, std::__1::char_traits<char>>&, unwindstack::AndroidLocalUnwinder&, int, char const*, art::ArtMethod*, void*, bool)+184) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:19:50.637 430 430 F DEBUG : #09 pc 000000000087fe1c /apex/com.android.art/lib64/libart.so (art::Thread::DumpStack(std::__1::basic_ostream<char, std::__1::char_traits<char>>&, unwindstack::AndroidLocalUnwinder&, bool, bool) const+360) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:19:50.637 430 430 F DEBUG : #10 pc 00000000008a21d0 /apex/com.android.art/lib64/libart.so (art::DumpCheckpoint::Run(art::Thread*)+1204) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:19:50.637 430 430 F DEBUG : #11 pc 000000000089932c /apex/com.android.art/lib64/libart.so (art::ThreadList::RunCheckpoint(art::Closure*, art::Closure*, bool, bool)+2964) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:19:50.637 430 430 F DEBUG : #12 pc 0000000000897e10 /apex/com.android.art/lib64/libart.so (art::ThreadList::Dump(std::__1::basic_ostream<char, std::__1::char_traits<char>>&, bool)+920) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:19:50.637 430 430 F DEBUG : #13 pc 0000000000897a1c /apex/com.android.art/lib64/libart.so 
(art::ThreadList::DumpForSigQuit(std::__1::basic_ostream<char, std::__1::char_traits<char>>&)+1436) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:19:50.637 430 430 F DEBUG : #14 pc 0000000000848318 /apex/com.android.art/lib64/libart.so (art::Runtime::DumpForSigQuit(std::__1::basic_ostream<char, std::__1::char_traits<char>>&)+60) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:19:50.637 430 430 F DEBUG : #15 pc 0000000000869f38 /apex/com.android.art/lib64/libart.so (art::SignalCatcher::Run(void*)+5484) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:19:50.637 430 430 F DEBUG : #16 pc 0000000000073cd0 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+204) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:19:50.637 430 430 F DEBUG : #17 pc 0000000000065bb0 /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:25:49.660 3607 3607 F libc : Fatal signal 6 (SIGABRT), code 128 (SI_KERNEL) in tid 3607 (system_server), pid 3607 (system_server) 06-15 04:25:50.946 11521 11521 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** 06-15 04:25:50.946 11521 11521 F DEBUG : Build fingerprint: 'unknown/pacific/pacific:15/0.8.0.0/94:user/test-keys' 06-15 04:25:50.946 11521 11521 F DEBUG : Revision: '0' 06-15 04:25:50.946 11521 11521 F DEBUG : ABI: 'arm64' 06-15 04:25:50.946 11521 11521 F DEBUG : Timestamp: 2025-06-15 04:25:50.026572831+0800 06-15 04:25:50.946 11521 11521 F DEBUG : Process uptime: 1036s 06-15 04:25:50.946 11521 11521 F DEBUG : Cmdline: system_server 06-15 04:25:50.946 11521 11521 F DEBUG : pid: 3607, tid: 3607, name: system_server >>> system_server <<< 06-15 04:25:50.946 11521 11521 F DEBUG : uid: 1000 06-15 04:25:50.946 11521 11521 F DEBUG : tagged_addr_ctrl: 0000000000000001 (PR_TAGGED_ADDR_ENABLE) 06-15 04:25:50.946 11521 11521 F DEBUG : pac_enabled_keys: 000000000000000f (PR_PAC_APIAKEY, PR_PAC_APIBKEY, PR_PAC_APDAKEY, PR_PAC_APDBKEY) 06-15 04:25:50.946 11521 11521 F DEBUG : signal 6 (SIGABRT), code 128 (SI_KERNEL), fault addr -------- 06-15 04:25:50.946 11521 11521 F DEBUG : x0 b4000071b8893c60 x1 0000000000000080 x2 00000000000007d3 x3 0000000000000000 06-15 04:25:50.947 11521 11521 F DEBUG : x4 0000000000000000 x5 0000000000000000 x6 0000000000000000 x7 000000702465b92c 06-15 04:25:50.947 11521 11521 F DEBUG : x8 0000000000000062 x9 aae30b8c88d99a37 x10 0000000000000001 x11 0000000000008001 06-15 04:25:50.947 11521 11521 F DEBUG : x12 0000000000000004 x13 000000007fffffff x14 0000000000056ae6 x15 0000000344f4d1c1 06-15 04:25:50.947 11521 11521 F DEBUG : x16 0000007024a22a20 x17 00000072c24c6f00 x18 00000072efc34000 x19 b4000071b8893c50 06-15 04:25:50.947 11521 11521 F DEBUG : x20 0000000000000000 x21 00000000000007d3 x22 0000007024c0e390 x23 00000070240537d5 06-15 04:25:50.947 11521 11521 F DEBUG : x24 00000072eef598c0 x25 0000000000000001 x26 00000072eef598c0 x27 0000000000fffff7 06-15 04:25:50.947 11521 11521 F DEBUG : x28 b4000070a88b5e80 x29 0000007ff5cfcd50 06-15 04:25:50.947 11521 11521 F DEBUG : lr 000000702447cb68 sp 0000007ff5cfcd40 pc 00000072c24c6f20 pst 0000000060001000 06-15 04:25:50.947 11521 11521 F DEBUG : 33 total frames 06-15 04:25:50.947 11521 11521 F DEBUG : backtrace: 06-15 04:25:50.947 11521 11521 F DEBUG : #00 pc 000000000008bf20 /apex/com.android.runtime/lib64/bionic/libc.so (syscall+32) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:25:50.947 11521 11521 F DEBUG : #01 pc 000000000047cb64 /apex/com.android.art/lib64/libart.so 
(art::ConditionVariable::WaitHoldingLocks(art::Thread*)+140) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #02 pc 000000000066c880 /apex/com.android.art/lib64/libart.so (art::JNI<false>::CallObjectMethodV(_JNIEnv*, _jobject*, _jmethodID*, std::__va_list)+492) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #03 pc 00000000000df234 /system/lib64/libandroid_runtime.so (_JNIEnv::CallObjectMethod(_jobject*, _jmethodID*, ...)+124) (BuildId: 8be94ccb8d309e803b6ab32930a3b12b) 06-15 04:25:50.947 11521 11521 F DEBUG : #04 pc 00000000001fe4b4 /system/lib64/libandroid_runtime.so ((anonymous namespace)::Receiver::handleEvent(int, int, void*)+100) (BuildId: 8be94ccb8d309e803b6ab32930a3b12b) 06-15 04:25:50.947 11521 11521 F DEBUG : #05 pc 00000000000142e8 /system/lib64/libutils.so (android::Looper::pollInner(int)+1236) (BuildId: bb46aaa986a05e541482395c328d50a0) 06-15 04:25:50.947 11521 11521 F DEBUG : #06 pc 0000000000013db0 /system/lib64/libutils.so (android::Looper::pollOnce(int, int*, int*, void**)+124) (BuildId: bb46aaa986a05e541482395c328d50a0) 06-15 04:25:50.947 11521 11521 F DEBUG : #07 pc 000000000019e220 /system/lib64/libandroid_runtime.so (android::android_os_MessageQueue_nativePollOnce(_JNIEnv*, _jobject*, long, int)+48) (BuildId: 8be94ccb8d309e803b6ab32930a3b12b) 06-15 04:25:50.947 11521 11521 F DEBUG : #08 pc 000000000037fd40 [anon_shmem:dalvik-jit-code-cache] (offset 0x2000000) (art_jni_trampoline+112) 06-15 04:25:50.947 11521 11521 F DEBUG : #09 pc 0000000000034e58 [anon_shmem:dalvik-zygote-jit-code-cache] (offset 0x2000000) (android.os.MessageQueue.next+264) 06-15 04:25:50.947 11521 11521 F DEBUG : #10 pc 00000000008e10ec [anon_shmem:dalvik-zygote-jit-code-cache] (offset 0x2000000) (android.os.Looper.loopOnce+92) 06-15 04:25:50.947 11521 11521 F DEBUG : #11 pc 000000000005a4ec [anon_shmem:dalvik-zygote-jit-code-cache] (offset 0x2000000) (android.os.Looper.loop+252) 06-15 04:25:50.947 11521 11521 F DEBUG : #12 pc 0000000000209408 /apex/com.android.art/lib64/libart.so (nterp_helper+152) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #13 pc 00000000002a2e80 /system/framework/services.jar (com.android.server.SystemServer.run+1276) 06-15 04:25:50.947 11521 11521 F DEBUG : #14 pc 000000000020a2c4 /apex/com.android.art/lib64/libart.so (nterp_helper+3924) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #15 pc 00000000002a27ea /system/framework/services.jar (com.android.server.SystemServer.main+10) 06-15 04:25:50.947 11521 11521 F DEBUG : #16 pc 0000000000210a40 /apex/com.android.art/lib64/libart.so (art_quick_invoke_static_stub+640) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #17 pc 0000000000472050 /apex/com.android.art/lib64/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+216) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #18 pc 000000000082cf80 /apex/com.android.art/lib64/libart.so (_jobject* art::InvokeMethod<(art::PointerSize)8>(art::ScopedObjectAccessAlreadyRunnable const&, _jobject*, _jobject*, _jobject*, unsigned long)+2108) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #19 pc 00000000007995a8 /apex/com.android.art/lib64/libart.so (art::Method_invoke(_JNIEnv*, _jobject*, _jobject*, _jobjectArray*) (.__uniq.165753521025965369065708152063621506277)+36) (BuildId: 
29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #20 pc 0000000000226f70 /apex/com.android.art/lib64/libart.so (art_quick_generic_jni_trampoline+144) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #21 pc 000000000020a320 /apex/com.android.art/lib64/libart.so (nterp_helper+4016) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #22 pc 000000000020d112 /system/framework/framework.jar (com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run+18) 06-15 04:25:50.947 11521 11521 F DEBUG : #23 pc 000000000020b0e4 /apex/com.android.art/lib64/libart.so (nterp_helper+7540) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #24 pc 0000000000211a06 /system/framework/framework.jar (com.android.internal.os.ZygoteInit.main+558) 06-15 04:25:50.947 11521 11521 F DEBUG : #25 pc 0000000000210a40 /apex/com.android.art/lib64/libart.so (art_quick_invoke_static_stub+640) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #26 pc 0000000000472050 /apex/com.android.art/lib64/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+216) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #27 pc 000000000082d918 /apex/com.android.art/lib64/libart.so (art::JValue art::InvokeWithVarArgs<_jmethodID*>(art::ScopedObjectAccessAlreadyRunnable const&, _jobject*, _jmethodID*, std::__va_list)+472) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #28 pc 00000000006f4248 /apex/com.android.art/lib64/libart.so (art::JNI<true>::CallStaticVoidMethodV(_JNIEnv*, _jclass*, _jmethodID*, std::__va_list)+560) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:25:50.947 11521 11521 F DEBUG : #29 pc 00000000000e2f9c /system/lib64/libandroid_runtime.so (_JNIEnv::CallStaticVoidMethod(_jclass*, _jmethodID*, ...)+108) (BuildId: 8be94ccb8d309e803b6ab32930a3b12b) 06-15 04:25:50.947 11521 11521 F DEBUG : #30 pc 00000000000fa244 /system/lib64/libandroid_runtime.so (android::AndroidRuntime::start(char const*, android::Vector<android::String8> const&, bool)+916) (BuildId: 8be94ccb8d309e803b6ab32930a3b12b) 06-15 04:25:50.947 11521 11521 F DEBUG : #31 pc 00000000000047d8 /system/bin/app_process64 (main+1816) (BuildId: a3e8d583af2cdcff29751370d5826827) 06-15 04:25:50.947 11521 11521 F DEBUG : #32 pc 000000000005b9e4 /apex/com.android.runtime/lib64/bionic/libc.so (__libc_init+120) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:26:31.344 4216 4216 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.344 4216 4216 E AndroidRuntime: Process: com.obric.assistant:interactor, PID: 4216 06-15 04:26:31.344 4216 4216 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.345 6667 13825 E AndroidRuntime: FATAL EXCEPTION: DataStallThread 06-15 04:26:31.345 6667 13825 E AndroidRuntime: Process: com.bytedance.radioservice, PID: 6667 06-15 04:26:31.345 6667 13825 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.347 12431 12539 E AndroidRuntime: FATAL EXCEPTION: [GT]ColdPool#5 06-15 04:26:31.347 12431 12539 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.347 12431 12539 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.361 12431 12528 E AndroidRuntime: FATAL EXCEPTION: 
[GT]HotPool#3 06-15 04:26:31.361 12431 12528 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.361 12431 12528 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.371 12431 12181 E AndroidRuntime: FATAL EXCEPTION: [GT]HotPool#8 06-15 04:26:31.371 12431 12181 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.371 12431 12181 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.376 4068 4068 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.376 4068 4068 E AndroidRuntime: Process: com.android.systemui, PID: 4068 06-15 04:26:31.376 4068 4068 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.392 5764 8003 E AndroidRuntime: FATAL EXCEPTION: long-time-task-thread-4 06-15 04:26:31.392 5764 8003 E AndroidRuntime: Process: com.obric.memorydata, PID: 5764 06-15 04:26:31.392 5764 8003 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.393 26358 26358 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.393 26358 26358 E AndroidRuntime: Process: com.obric.feedback, PID: 26358 06-15 04:26:31.393 26358 26358 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.394 20846 20846 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.394 20846 20846 E AndroidRuntime: Process: com.android.launcher3, PID: 20846 06-15 04:26:31.394 20846 20846 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.398 11069 11069 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.398 11069 11069 E AndroidRuntime: Process: com.obric.mediametadataservice, PID: 11069 06-15 04:26:31.398 11069 11069 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.401 6234 7313 E AndroidRuntime: FATAL EXCEPTION: WsSurvivalHelper 06-15 04:26:31.401 6234 7313 E AndroidRuntime: Process: com.obric.matrix, PID: 6234 06-15 04:26:31.401 6234 7313 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.408 31180 31369 E AndroidRuntime: FATAL EXCEPTION: [GT]ColdPool#6 06-15 04:26:31.408 31180 31369 E AndroidRuntime: Process: com.tencent.mm:push, PID: 31180 06-15 04:26:31.408 31180 31369 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.427 31180 31180 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.427 31180 31180 E AndroidRuntime: Process: com.tencent.mm:push, PID: 31180 06-15 04:26:31.427 31180 31180 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.435 31180 31389 E AndroidRuntime: FATAL EXCEPTION: [GT]ColdPool#15 06-15 04:26:31.435 31180 31389 E AndroidRuntime: Process: com.tencent.mm:push, PID: 31180 06-15 04:26:31.435 31180 31389 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.436 31180 31280 E AndroidRuntime: FATAL EXCEPTION: [GT]ColdPool#2 06-15 04:26:31.436 31180 31280 E AndroidRuntime: Process: com.tencent.mm:push, PID: 31180 06-15 04:26:31.436 31180 31280 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.436 31180 31386 E AndroidRuntime: FATAL EXCEPTION: 
[GT]ColdPool#14 06-15 04:26:31.436 31180 31386 E AndroidRuntime: Process: com.tencent.mm:push, PID: 31180 06-15 04:26:31.436 31180 31386 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.443 4535 4535 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.443 4535 4535 E AndroidRuntime: Process: com.android.phone, PID: 4535 06-15 04:26:31.443 4535 4535 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.450 12431 12558 E AndroidRuntime: FATAL EXCEPTION: [GT]ColdPool#8 06-15 04:26:31.450 12431 12558 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.450 12431 12558 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.452 12069 12069 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.452 12069 12069 E AndroidRuntime: Process: com.obric.weather, PID: 12069 06-15 04:26:31.452 12069 12069 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.456 12431 12697 E AndroidRuntime: FATAL EXCEPTION: default_matrix_thread 06-15 04:26:31.456 12431 12697 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.456 12431 12697 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.457 12431 12522 E AndroidRuntime: FATAL EXCEPTION: [GT]HotPool#0 06-15 04:26:31.457 12431 12522 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.457 12431 12522 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.458 6025 7427 E AndroidRuntime: FATAL EXCEPTION: Thread-6 06-15 04:26:31.458 6025 7427 E AndroidRuntime: Process: com.obric.cae, PID: 6025 06-15 04:26:31.458 6025 7427 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.470 31180 31408 E AndroidRuntime: FATAL EXCEPTION: default_matrix_thread 06-15 04:26:31.470 31180 31408 E AndroidRuntime: Process: com.tencent.mm:push, PID: 31180 06-15 04:26:31.470 31180 31408 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.470 12431 12530 E AndroidRuntime: FATAL EXCEPTION: [GT]HotPool#5 06-15 04:26:31.470 12431 12530 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.470 12431 12530 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.490 12431 12431 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:26:31.490 12431 12431 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.490 12431 12431 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.495 12431 12586 E AndroidRuntime: FATAL EXCEPTION: [GT]ColdPool#13 06-15 04:26:31.495 12431 12586 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.495 12431 12586 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:26:31.502 12431 12532 E AndroidRuntime: FATAL EXCEPTION: [GT]HotPool#6 06-15 04:26:31.502 12431 12532 E AndroidRuntime: Process: com.tencent.mm, PID: 12431 06-15 04:26:31.502 12431 12532 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:27:20.283 18660 18660 F libc : Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 18660 
(init), pid 18660 (init) 06-15 04:27:20.325 18660 18660 F libc : crash_dump helper failed to exec, or was killed 06-15 04:34:31.899 13872 13872 F libc : Fatal signal 6 (SIGABRT), code 128 (SI_KERNEL) in tid 13872 (system_server), pid 13872 (system_server) 06-15 04:34:33.102 6200 6200 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** 06-15 04:34:33.102 6200 6200 F DEBUG : Build fingerprint: 'unknown/pacific/pacific:15/0.8.0.0/94:user/test-keys' 06-15 04:34:33.102 6200 6200 F DEBUG : Revision: '0' 06-15 04:34:33.102 6200 6200 F DEBUG : ABI: 'arm64' 06-15 04:34:33.102 6200 6200 F DEBUG : Timestamp: 2025-06-15 04:34:32.331351694+0800 06-15 04:34:33.102 6200 6200 F DEBUG : Process uptime: 470s 06-15 04:34:33.102 6200 6200 F DEBUG : Cmdline: system_server 06-15 04:34:33.102 6200 6200 F DEBUG : pid: 13872, tid: 13872, name: system_server >>> system_server <<< 06-15 04:34:33.102 6200 6200 F DEBUG : uid: 1000 06-15 04:34:33.102 6200 6200 F DEBUG : tagged_addr_ctrl: 0000000000000001 (PR_TAGGED_ADDR_ENABLE) 06-15 04:34:33.102 6200 6200 F DEBUG : pac_enabled_keys: 000000000000000f (PR_PAC_APIAKEY, PR_PAC_APIBKEY, PR_PAC_APDAKEY, PR_PAC_APDBKEY) 06-15 04:34:33.102 6200 6200 F DEBUG : signal 6 (SIGABRT), code 128 (SI_KERNEL), fault addr -------- 06-15 04:34:33.102 6200 6200 F DEBUG : x0 fffffffffffffffc x1 0000007fee5feea0 x2 0000000000000010 x3 0000000000002710 06-15 04:34:33.102 6200 6200 F DEBUG : x4 0000000000000000 x5 0000000000000008 x6 0000007fee5fdd30 x7 0006f4f48f1df93c 06-15 04:34:33.102 6200 6200 F DEBUG : x8 0000000000000016 x9 ffffffffffb9b170 x10 0000000000000009 x11 00000000000d800b 06-15 04:34:33.102 6200 6200 F DEBUG : x12 0000000000000006 x13 0000000000000005 x14 00000000000be490 x15 00000000ebad6a89 06-15 04:34:33.102 6200 6200 F DEBUG : x16 0000007488763c40 x17 000000747d31cf30 x18 0000007493608000 x19 b40000737fa9dd50 06-15 04:34:33.102 6200 6200 F DEBUG : x20 0000000000002710 x21 0000007fee5feea0 x22 0000000000002710 x23 0000007492d8c8c0 06-15 04:34:33.102 6200 6200 F DEBUG : x24 0000000000000030 x25 000000007fffffff x26 0000000000000001 x27 0000000000000006 06-15 04:34:33.102 6200 6200 F DEBUG : x28 0000007fee5ff090 x29 0000007fee5fefc0 06-15 04:34:33.102 6200 6200 F DEBUG : lr 0000007488756edc sp 0000007fee5fee60 pc 000000747d36a78c pst 0000000080001000 06-15 04:34:33.102 6200 6200 F DEBUG : 29 total frames 06-15 04:34:33.102 6200 6200 F DEBUG : backtrace: 06-15 04:34:33.102 6200 6200 F DEBUG : #00 pc 00000000000ca78c /apex/com.android.runtime/lib64/bionic/libc.so (__epoll_pwait+12) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:34:33.102 6200 6200 F DEBUG : #01 pc 0000000000013ed8 /system/lib64/libutils.so (android::Looper::pollInner(int)+196) (BuildId: bb46aaa986a05e541482395c328d50a0) 06-15 04:34:33.102 6200 6200 F DEBUG : #02 pc 0000000000013db0 /system/lib64/libutils.so (android::Looper::pollOnce(int, int*, int*, void**)+124) (BuildId: bb46aaa986a05e541482395c328d50a0) 06-15 04:34:33.102 6200 6200 F DEBUG : #03 pc 000000000019e220 /system/lib64/libandroid_runtime.so (android::android_os_MessageQueue_nativePollOnce(_JNIEnv*, _jobject*, long, int)+48) (BuildId: 8be94ccb8d309e803b6ab32930a3b12b) 06-15 04:34:33.102 6200 6200 F DEBUG : #04 pc 0000000000226f70 /apex/com.android.art/lib64/libart.so (art_quick_generic_jni_trampoline+144) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #05 pc 0000000000035038 [anon_shmem:dalvik-zygote-jit-code-cache] (offset 0x2000000) (android.os.MessageQueue.next+264) 06-15 
04:34:33.102 6200 6200 F DEBUG : #06 pc 0000000000910abc [anon_shmem:dalvik-zygote-jit-code-cache] (offset 0x2000000) (android.os.Looper.loopOnce+92) 06-15 04:34:33.102 6200 6200 F DEBUG : #07 pc 000000000005b9fc [anon_shmem:dalvik-zygote-jit-code-cache] (offset 0x2000000) (android.os.Looper.loop+252) 06-15 04:34:33.102 6200 6200 F DEBUG : #08 pc 0000000000209408 /apex/com.android.art/lib64/libart.so (nterp_helper+152) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #09 pc 00000000002a2e80 /system/framework/services.jar (com.android.server.SystemServer.run+1276) 06-15 04:34:33.102 6200 6200 F DEBUG : #10 pc 000000000020a2c4 /apex/com.android.art/lib64/libart.so (nterp_helper+3924) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #11 pc 00000000002a27ea /system/framework/services.jar (com.android.server.SystemServer.main+10) 06-15 04:34:33.102 6200 6200 F DEBUG : #12 pc 0000000000210a40 /apex/com.android.art/lib64/libart.so (art_quick_invoke_static_stub+640) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #13 pc 0000000000472050 /apex/com.android.art/lib64/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+216) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #14 pc 000000000082cf80 /apex/com.android.art/lib64/libart.so (_jobject* art::InvokeMethod<(art::PointerSize)8>(art::ScopedObjectAccessAlreadyRunnable const&, _jobject*, _jobject*, _jobject*, unsigned long)+2108) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #15 pc 00000000007995a8 /apex/com.android.art/lib64/libart.so (art::Method_invoke(_JNIEnv*, _jobject*, _jobject*, _jobjectArray*) (.__uniq.165753521025965369065708152063621506277)+36) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #16 pc 0000000000226f70 /apex/com.android.art/lib64/libart.so (art_quick_generic_jni_trampoline+144) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #17 pc 000000000020a320 /apex/com.android.art/lib64/libart.so (nterp_helper+4016) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #18 pc 000000000020d112 /system/framework/framework.jar (com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run+18) 06-15 04:34:33.102 6200 6200 F DEBUG : #19 pc 000000000020b0e4 /apex/com.android.art/lib64/libart.so (nterp_helper+7540) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #20 pc 0000000000211a06 /system/framework/framework.jar (com.android.internal.os.ZygoteInit.main+558) 06-15 04:34:33.102 6200 6200 F DEBUG : #21 pc 0000000000210a40 /apex/com.android.art/lib64/libart.so (art_quick_invoke_static_stub+640) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #22 pc 0000000000472050 /apex/com.android.art/lib64/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+216) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #23 pc 000000000082d918 /apex/com.android.art/lib64/libart.so (art::JValue art::InvokeWithVarArgs<_jmethodID*>(art::ScopedObjectAccessAlreadyRunnable const&, _jobject*, _jmethodID*, std::__va_list)+472) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #24 pc 00000000006f4248 /apex/com.android.art/lib64/libart.so 
(art::JNI<true>::CallStaticVoidMethodV(_JNIEnv*, _jclass*, _jmethodID*, std::__va_list)+560) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:34:33.102 6200 6200 F DEBUG : #25 pc 00000000000e2f9c /system/lib64/libandroid_runtime.so (_JNIEnv::CallStaticVoidMethod(_jclass*, _jmethodID*, ...)+108) (BuildId: 8be94ccb8d309e803b6ab32930a3b12b) 06-15 04:34:33.102 6200 6200 F DEBUG : #26 pc 00000000000fa244 /system/lib64/libandroid_runtime.so (android::AndroidRuntime::start(char const*, android::Vector<android::String8> const&, bool)+916) (BuildId: 8be94ccb8d309e803b6ab32930a3b12b) 06-15 04:34:33.102 6200 6200 F DEBUG : #27 pc 00000000000047d8 /system/bin/app_process64 (main+1816) (BuildId: a3e8d583af2cdcff29751370d5826827) 06-15 04:34:33.102 6200 6200 F DEBUG : #28 pc 000000000005b9e4 /apex/com.android.runtime/lib64/bionic/libc.so (__libc_init+120) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:34:33.511 23497 24117 E AndroidRuntime: FATAL EXCEPTION: DatabaseSyncService 06-15 04:34:33.511 23497 24117 E AndroidRuntime: Process: com.smartisanos.gallery, PID: 23497 06-15 04:34:33.511 23497 24117 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:34:33.519 19416 31631 E AndroidRuntime: FATAL EXCEPTION: DataStallThread 06-15 04:34:33.519 19416 31631 E AndroidRuntime: Process: com.bytedance.radioservice, PID: 19416 06-15 04:34:33.519 19416 31631 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:34:33.525 23759 23759 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:34:33.525 23759 23759 E AndroidRuntime: Process: com.obric.mediametadataservice, PID: 23759 06-15 04:34:33.525 23759 23759 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:34:33.562 27764 27764 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:34:33.562 27764 27764 E AndroidRuntime: Process: com.android.systemui, PID: 27764 06-15 04:34:33.562 27764 27764 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:34:33.570 17071 17071 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:34:33.570 17071 17071 E AndroidRuntime: Process: com.qti.qcc, PID: 17071 06-15 04:34:33.570 17071 17071 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:34:33.588 30577 30748 E AndroidRuntime: FATAL EXCEPTION: [GT]ColdPool#10 06-15 04:34:33.588 30577 30748 E AndroidRuntime: Process: com.tencent.mm, PID: 30577 06-15 04:34:33.588 30577 30748 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:34:33.619 20917 20917 E AndroidRuntime: FATAL EXCEPTION: main 06-15 04:34:33.619 20917 20917 E AndroidRuntime: Process: com.android.providers.weather, PID: 20917 06-15 04:34:33.619 20917 20917 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:34:33.641 29760 30119 E AndroidRuntime: FATAL EXCEPTION: WM.task-4 06-15 04:34:33.641 29760 30119 E AndroidRuntime: Process: com.android.rkpdapp, PID: 29760 06-15 04:34:33.641 29760 30119 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:34:33.644 16591 22658 E AndroidRuntime: FATAL EXCEPTION: pool-12-thread-1 06-15 04:34:33.644 16591 22658 E AndroidRuntime: Process: com.bytedance.os.mermaid, PID: 16591 06-15 04:34:33.644 16591 22658 E AndroidRuntime: DeadSystemException: The system died; 
earlier logs will point to the root cause 06-15 04:34:33.670 30577 30776 E AndroidRuntime: FATAL EXCEPTION: [GT]ColdPool#14 06-15 04:34:33.670 30577 30776 E AndroidRuntime: Process: com.tencent.mm, PID: 30577 06-15 04:34:33.670 30577 30776 E AndroidRuntime: DeadSystemException: The system died; earlier logs will point to the root cause 06-15 04:35:16.613 11852 11852 F libc : Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 11852 (init), pid 11852 (init) 06-15 04:35:16.682 11852 11852 F libc : crash_dump helper failed to exec, or was killed 06-15 04:36:05.261 12659 12670 F libc : Fatal signal 6 (SIGABRT), code -6 (SI_TKILL) in tid 12670 (Signal Catcher), pid 12659 (com.obric.cae) 06-15 04:36:07.796 10043 10052 F libc : Fatal signal 6 (SIGABRT), code -6 (SI_TKILL) in tid 10052 (Signal Catcher), pid 10043 (ndroid.systemui) 06-15 04:36:09.096 16114 16114 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** 06-15 04:36:09.096 16114 16114 F DEBUG : Build fingerprint: 'unknown/pacific/pacific:15/0.8.0.0/94:user/test-keys' 06-15 04:36:09.096 16114 16114 F DEBUG : Revision: '0' 06-15 04:36:09.096 16114 16114 F DEBUG : ABI: 'arm64' 06-15 04:36:09.096 16114 16114 F DEBUG : Timestamp: 2025-06-15 04:36:06.131456971+0800 06-15 04:36:09.096 16114 16114 F DEBUG : Process uptime: 46s 06-15 04:36:09.096 16114 16114 F DEBUG : Cmdline: com.obric.cae 06-15 04:36:09.096 16114 16114 F DEBUG : pid: 12659, tid: 12670, name: Signal Catcher >>> com.obric.cae <<< 06-15 04:36:09.096 16114 16114 F DEBUG : uid: 1000 06-15 04:36:09.096 16114 16114 F DEBUG : tagged_addr_ctrl: 0000000000000001 (PR_TAGGED_ADDR_ENABLE) 06-15 04:36:09.096 16114 16114 F DEBUG : pac_enabled_keys: 000000000000000f (PR_PAC_APIAKEY, PR_PAC_APIBKEY, PR_PAC_APDAKEY, PR_PAC_APDBKEY) 06-15 04:36:09.096 16114 16114 F DEBUG : signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr -------- 06-15 04:36:09.096 16114 16114 F DEBUG : Abort message: 'Caused BootImagePollingThread failure : SuspendAll timeout: Unsuspended threads: Thread[2,tid=12670,Runnable,Thread*=0xb400007073e4ff50,peer=0x146408b0,"Signal Catcher"], Info for Thread[2,tid=12670,Runnable,Thread*=0xb400007073e4ff50,peer=0x146408b0,"Signal Catcher"]:Signal Catcher tid: 12670, state&flags: 0x9, priority: 10, barrier value: 1, Target states: [12670 (Signal Catcher) D 6404 6404 0 0 -1 4194368 7865 0 0 0 28 11 0 0 0 -20 31 0 161685 , 12670 (Signal Catcher) D 6404 6404 0 0 -1 4194368 7907 0 0 0 29 11 0 0 0 -20 31 0 161685 ]1@474460762748 Final wait time: 1.041s' 06-15 04:36:09.096 16114 16114 F DEBUG : x0 000000000000005f x1 0000006e780051f8 x2 0000000000001000 x3 0000000000000000 06-15 04:36:09.096 16114 16114 F DEBUG : x4 0000006e78005d24 x5 b400007063ecaf6c x6 65732f636f72702f x7 2f6b7361742f666c 06-15 04:36:09.096 16114 16114 F DEBUG : x8 000000000000003f x9 0000000000000000 x10 00000000ece8b1a8 x11 00000000ace540c2 06-15 04:36:09.096 16114 16114 F DEBUG : x12 373632312f6b7361 x13 70756f7267632f34 x14 0000000000000000 x15 0000000000000000 06-15 04:36:09.096 16114 16114 F DEBUG : x16 00000071428f0650 x17 00000071417024a0 x18 0000006e72c44000 x19 0000006e78006340 06-15 04:36:09.096 16114 16114 F DEBUG : x20 0000000000000077 x21 0000006e780078c0 x22 0000000000000077 x23 0000006e780078c0 06-15 04:36:09.096 16114 16114 F DEBUG : x24 0000006e78006329 x25 0000006e78006548 x26 000000712dfa65f0 x27 000000712dfa65b0 06-15 04:36:09.097 16114 16114 F DEBUG : x28 0000006e78006328 x29 0000006e78006200 06-15 04:36:09.097 16114 16114 F DEBUG : lr 00000071428d2274 sp 0000006e78005170 pc 
00000071417024ac pst 0000000080001000 06-15 04:36:09.097 16114 16114 F DEBUG : 12 total frames 06-15 04:36:09.097 16114 16114 F DEBUG : backtrace: 06-15 04:36:09.097 16114 16114 F DEBUG : #00 pc 00000000000c94ac /apex/com.android.runtime/lib64/bionic/libc.so (read+12) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:36:09.097 16114 16114 F DEBUG : #01 pc 0000000000012270 /apex/com.android.art/lib64/libbase.so (android::base::ReadFdToString(android::base::borrowed_fd, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>*)+280) (BuildId: 8367396248ab14cf4164f2cfe0829082) 06-15 04:36:09.097 16114 16114 F DEBUG : #02 pc 00000000000123e0 /apex/com.android.art/lib64/libbase.so (android::base::ReadFileToString(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>*, bool)+192) (BuildId: 8367396248ab14cf4164f2cfe0829082) 06-15 04:36:09.097 16114 16114 F DEBUG : #03 pc 0000000000883f68 /apex/com.android.art/lib64/libart.so (art::Thread::DumpState(std::__1::basic_ostream<char, std::__1::char_traits<char>>&, art::Thread const*, int)+1764) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:09.097 16114 16114 F DEBUG : #04 pc 00000000008a21b8 /apex/com.android.art/lib64/libart.so (art::DumpCheckpoint::Run(art::Thread*)+1180) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:09.097 16114 16114 F DEBUG : #05 pc 000000000089932c /apex/com.android.art/lib64/libart.so (art::ThreadList::RunCheckpoint(art::Closure*, art::Closure*, bool, bool)+2964) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:09.097 16114 16114 F DEBUG : #06 pc 0000000000897e10 /apex/com.android.art/lib64/libart.so (art::ThreadList::Dump(std::__1::basic_ostream<char, std::__1::char_traits<char>>&, bool)+920) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:09.097 16114 16114 F DEBUG : #07 pc 0000000000897a1c /apex/com.android.art/lib64/libart.so (art::ThreadList::DumpForSigQuit(std::__1::basic_ostream<char, std::__1::char_traits<char>>&)+1436) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:09.097 16114 16114 F DEBUG : #08 pc 0000000000848318 /apex/com.android.art/lib64/libart.so (art::Runtime::DumpForSigQuit(std::__1::basic_ostream<char, std::__1::char_traits<char>>&)+60) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:09.097 16114 16114 F DEBUG : #09 pc 0000000000869f38 /apex/com.android.art/lib64/libart.so (art::SignalCatcher::Run(void*)+5484) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:09.097 16114 16114 F DEBUG : #10 pc 0000000000073cd0 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+204) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:36:09.097 16114 16114 F DEBUG : #11 pc 0000000000065bb0 /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:36:10.991 16353 16353 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** 06-15 04:36:10.991 16353 16353 F DEBUG : Build fingerprint: 'unknown/pacific/pacific:15/0.8.0.0/94:user/test-keys' 06-15 04:36:10.991 16353 16353 F DEBUG : Revision: '0' 06-15 04:36:10.991 16353 16353 F DEBUG : ABI: 'arm64' 06-15 04:36:10.991 16353 16353 F DEBUG : Timestamp: 2025-06-15 04:36:09.222489157+0800 06-15 04:36:10.991 16353 16353 F DEBUG : Process uptime: 65s 06-15 04:36:10.991 16353 16353 F DEBUG : Cmdline: com.android.systemui 06-15 04:36:10.991 16353 16353 F DEBUG : pid: 10043, tid: 
10052, name: Signal Catcher >>> com.android.systemui <<< 06-15 04:36:10.991 16353 16353 F DEBUG : uid: 10147 06-15 04:36:10.991 16353 16353 F DEBUG : tagged_addr_ctrl: 0000000000000001 (PR_TAGGED_ADDR_ENABLE) 06-15 04:36:10.991 16353 16353 F DEBUG : pac_enabled_keys: 000000000000000f (PR_PAC_APIAKEY, PR_PAC_APIBKEY, PR_PAC_APDAKEY, PR_PAC_APDBKEY) 06-15 04:36:10.991 16353 16353 F DEBUG : signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr -------- 06-15 04:36:10.991 16353 16353 F DEBUG : Abort message: 'Caused BootImagePollingThread failure : SuspendAll timeout: Unsuspended threads: Thread[2,tid=10052,Runnable,Thread*=0xb400007073e552c0,peer=0x161c2338,"Signal Catcher"], Info for Thread[2,tid=10052,Runnable,Thread*=0xb400007073e552c0,peer=0x161c2338,"Signal Catcher"]:Signal Catcher tid: 10052, state&flags: 0x9, priority: 10, barrier value: 1, Target states: [10052 (Signal Catcher) D 6404 6404 0 0 -1 4194368 72694 715 0 0 864 114 0 1 0 -20 99 0 16, 10052 (Signal Catcher) S 6404 6404 0 0 -1 4194368 72694 715 0 0 864 114 0 1 0 -20 99 0 16]1@474460861052 Final wait time: 1.014s' 06-15 04:36:10.991 16353 16353 F DEBUG : x0 fffffffffffffffc x1 0000000000000089 x2 0000000000000010 x3 0000006e7701dd18 06-15 04:36:10.991 16353 16353 F DEBUG : x4 0000000000000000 x5 00000000ffffffff x6 00000000ffffffff x7 7365786574756d20 06-15 04:36:10.991 16353 16353 F DEBUG : x8 0000000000000062 x9 aacc454f7c510c3f x10 fffffffffffef005 x11 0000000031f4eed8 06-15 04:36:10.991 16353 16353 F DEBUG : x12 00000000684ddd36 x13 000000007fffffff x14 00000000000ca068 x15 000000077e673b1c 06-15 04:36:10.991 16353 16353 F DEBUG : x16 00000071417180d8 x17 00000071416c4f00 x18 0000006e73d6c000 x19 0000000000000010 06-15 04:36:10.991 16353 16353 F DEBUG : x20 0000006e7701dd18 x21 b4000070741074a8 x22 0000000000000089 x23 0000006e7701f860 06-15 04:36:10.991 16353 16353 F DEBUG : x24 0000006e7701f8c0 x25 0000000000000000 x26 00000000ee6a1a05 x27 0000000000000001 06-15 04:36:10.991 16353 16353 F DEBUG : x28 0000000000000000 x29 0000006e7701dd30 06-15 04:36:10.991 16353 16353 F DEBUG : lr 000000714169d5b8 sp 0000006e7701dd10 pc 00000071416c4f24 pst 0000000060001000 06-15 04:36:10.991 16353 16353 F DEBUG : 18 total frames 06-15 04:36:10.991 16353 16353 F DEBUG : backtrace: 06-15 04:36:10.991 16353 16353 F DEBUG : #00 pc 000000000008bf24 /apex/com.android.runtime/lib64/bionic/libc.so (syscall+36) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:36:10.991 16353 16353 F DEBUG : #01 pc 00000000000645b4 /apex/com.android.runtime/lib64/bionic/libc.so (__futex_wait_ex(void volatile*, bool, int, bool, timespec const*)+148) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:36:10.991 16353 16353 F DEBUG : #02 pc 0000000000072f48 /apex/com.android.runtime/lib64/bionic/libc.so (pthread_cond_timedwait+136) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:36:10.991 16353 16353 F DEBUG : #03 pc 00000000000b0aec /apex/com.android.art/lib64/libc++.so (std::__1::condition_variable::__do_timed_wait(std::__1::unique_lock<std::__1::mutex>&, std::__1::chrono::time_point<std::__1::chrono::system_clock, std::__1::chrono::duration<long long, std::__1::ratio<1l, 1000000000l>>>)+96) (BuildId: 53e0091d25a788802d2d3a5324f79b527df4913f) 06-15 04:36:10.991 16353 16353 F DEBUG : #04 pc 0000000000093530 /apex/com.android.art/lib64/libunwindstack.so (unwindstack::ThreadEntry::Wait(unwindstack::WaitType)+140) (BuildId: c12353edf5bb03325316f4802d7fa4b4) 06-15 04:36:10.991 16353 16353 F DEBUG : #05 pc 00000000000939e4 
/apex/com.android.art/lib64/libunwindstack.so (unwindstack::ThreadUnwinder::SendSignalToThread(int, int)+296) (BuildId: c12353edf5bb03325316f4802d7fa4b4) 06-15 04:36:10.991 16353 16353 F DEBUG : #06 pc 0000000000093bec /apex/com.android.art/lib64/libunwindstack.so (unwindstack::ThreadUnwinder::UnwindWithSignal(int, int, std::__1::unique_ptr<unwindstack::Regs, std::__1::default_delete<unwindstack::Regs>>*, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>> const*, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>> const*)+104) (BuildId: c12353edf5bb03325316f4802d7fa4b4) 06-15 04:36:10.991 16353 16353 F DEBUG : #07 pc 00000000000606f4 /apex/com.android.art/lib64/libunwindstack.so (unwindstack::AndroidLocalUnwinder::InternalUnwind(std::__1::optional<int>, unwindstack::AndroidUnwinderData&)+364) (BuildId: c12353edf5bb03325316f4802d7fa4b4) 06-15 04:36:10.991 16353 16353 F DEBUG : #08 pc 00000000007a3be0 /apex/com.android.art/lib64/libart.so (art::DumpNativeStack(std::__1::basic_ostream<char, std::__1::char_traits<char>>&, unwindstack::AndroidLocalUnwinder&, int, char const*, art::ArtMethod*, void*, bool)+184) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:10.991 16353 16353 F DEBUG : #09 pc 000000000087fe1c /apex/com.android.art/lib64/libart.so (art::Thread::DumpStack(std::__1::basic_ostream<char, std::__1::char_traits<char>>&, unwindstack::AndroidLocalUnwinder&, bool, bool) const+360) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:10.991 16353 16353 F DEBUG : #10 pc 00000000008a21d0 /apex/com.android.art/lib64/libart.so (art::DumpCheckpoint::Run(art::Thread*)+1204) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:10.991 16353 16353 F DEBUG : #11 pc 000000000089932c /apex/com.android.art/lib64/libart.so (art::ThreadList::RunCheckpoint(art::Closure*, art::Closure*, bool, bool)+2964) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:10.991 16353 16353 F DEBUG : #12 pc 0000000000897e10 /apex/com.android.art/lib64/libart.so (art::ThreadList::Dump(std::__1::basic_ostream<char, std::__1::char_traits<char>>&, bool)+920) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:10.991 16353 16353 F DEBUG : #13 pc 0000000000897a1c /apex/com.android.art/lib64/libart.so (art::ThreadList::DumpForSigQuit(std::__1::basic_ostream<char, std::__1::char_traits<char>>&)+1436) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:10.991 16353 16353 F DEBUG : #14 pc 0000000000848318 /apex/com.android.art/lib64/libart.so (art::Runtime::DumpForSigQuit(std::__1::basic_ostream<char, std::__1::char_traits<char>>&)+60) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:10.991 16353 16353 F DEBUG : #15 pc 0000000000869f38 /apex/com.android.art/lib64/libart.so (art::SignalCatcher::Run(void*)+5484) (BuildId: 29a487f0c8088464e14dcbff6c86797f) 06-15 04:36:10.991 16353 16353 F DEBUG : #16 pc 0000000000073cd0 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+204) (BuildId: 5f5c1386426a2756c92c6d45ddc06654) 06-15 04:36:10.991 16353 16353 F DEBUG : #17 pc 0000000000065bb0 /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: 5f5c1386426a2756c92c6d45ddc06654)
06-19
(gdb) run The program being debugged has been started already. Start it from the beginning? (y or n) y Starting program: /home/562381/code/P_2023.05.06_IPCHisilicon_Vodka_RUB_SD/./mkconfig warning: Error disabling address space randomization: Operation not permitted /*************************************************************************************************/ Session : SESSION_Normal_Normal_ITOP_Dahua_Chn_PN SESSION_Normal_Normal_ITOP_Dahua_MultiLang_PN SESSION_Normal_Normal_ITOP_General_Chn_PN SESSION_Normal_Normal_ITOP_General_MultiLang_PN SESSION_Normal_Normal_LLM_ITOP_Dahua_Chn_PN SESSION_Normal_Normal_LLM_ITOP_Dahua_MultiLang_PN SESSION_Normal_Normal_LLM_ITOP_General_Chn_PN SESSION_Normal_Normal_LLM_ITOP_General_MultiLang_PN SESSION_LingDongPro_Normal_Dahua_Chn_PN SESSION_LingDongPro_Normal_Dahua_MultiLang_PN SESSION_LingDongPro_Normal_General_Chn_PN SESSION_LingDongPro_Normal_General_MultiLang_PN SESSION_XVR_ShuangPinXVR_ITOP_Dahua_Chn_PN SESSION_XVR_ShuangPinXVR_ITOP_Dahua_MultiLang_PN SESSION_XVR_ShuangPinXVR_ITOP_General_Chn_PN SESSION_XVR_ShuangPinXVR_ITOP_General_MultiLang_PN SESSION_Normal_Monitor_ITOP_Dahua_Chn_PN SESSION_Normal_Monitor_ITOP_General_Chn_PN SESSION_Normal_Traffic_ITOP_Dahua_Chn_PN SESSION_Normal_Traffic_ITOP_General_Chn_PN SESSION_Normal_Energy_ITOP_Dahua_Chn_PN SESSION_Normal_Energy_ITOP_General_Chn_PN SESSION_Normal_Water_ITOP_Dahua_Chn_PN SESSION_Normal_Water_ITOP_General_Chn_PN SESSION_Normal_Water_LLM_ITOP_Dahua_Chn_PN SESSION_Normal_Water_LLM_ITOP_General_Chn_PN SESSION_Normal_DHOP_ITOP_Dahua_MultiLang_PN SESSION_Normal_DHOP_ITOP_General_MultiLang_PN SESSION_XVR_ShuangPinXVR_GV_ITOP_Dahua_Chn_PN SESSION_XVR_ShuangPinXVR_GV_ITOP_General_Chn_PN SESSION_Normal_Normal_None_Dahua_Chn_PN SESSION_Normal_Normal_None_General_Chn_PN SESSION_XVR_ShuangPinXVR_Acupick_ITOP_Dahua_MultiLang_PN SESSION_XVR_ShuangPinXVR_Acupick_ITOP_General_MultiLang_PN SESSION_XVR_ShuangPinXVR_Acupick_LLMMul_ITOP_Dahua_MultiLang_PN SESSION_XVR_ShuangPinXVR_Acupick_LLMMul_ITOP_General_MultiLang_PN SESSION_XVR_ShuangPinXVR_Acupick_LLM_ITOP_General_MultiLang_PN SESSION_XVR_ShuangPinXVR_Acupick_LLM_ITOP_Dahua_MultiLang_PN SESSION_XVR_ShuangPinXVR_Energy_ITOP_Dahua_Chn_PN SESSION_XVR_ShuangPinXVR_Energy_ITOP_General_Chn_PN SESSION_XVR_ShuangPinXVR_LLM_ITOP_Dahua_Chn_PN SESSION_XVR_ShuangPinXVR_LLM_ITOP_Dahua_MultiLang_PN SESSION_XVR_ShuangPinXVR_LLM_ITOP_General_Chn_PN SESSION_XVR_ShuangPinXVR_LLM_ITOP_General_MultiLang_PN SESSION_XCC_Agent_Normal_Dahua_Chn_PN SESSION_XCC_Agent_Traffic_Dahua_Chn_PN Transaction : All uboot kernel dtb romfs pd web firmware itop Version : 3.200.0000028.0.R 3. [Detaching after fork from child process 68657] [Detaching after fork from child process 68658]***************************************************/ cfg_productDef address: 0x7fff102176b8 pure virtual method called terminate called without an active exception Program received signal SIGABRT, Aborted. 0x00007f463f17b387 in raise () from /lib64/libc.so.6 (gdb) bt full #0 0x00007f463f17b387 in raise () from /lib64/libc.so.6 No symbol table info available. #1 0x00007f463f17ca78 in abort () from /lib64/libc.so.6 No symbol table info available. #2 0x00007f463fa8ba95 in __gnu_cxx::__verbose_terminate_handler() () from /lib64/libstdc++.so.6 No symbol table info available. #3 0x00007f463fa89a06 in ?? () from /lib64/libstdc++.so.6 No symbol table info available. #4 0x00007f463fa89a33 in std::terminate() () from /lib64/libstdc++.so.6 No symbol table info available. 
#5 0x00007f463fa8a59f in __cxa_pure_virtual () from /lib64/libstdc++.so.6 No symbol table info available. #6 0x000000000041138a in Json::Value::~Value (this=0x1071e5d0, __in_chrg=<optimized out>) at src/lib_json/json_value.cpp:518 No locals. #7 0x0000000000411438 in std::pair<Json::Value::CZString const, Json::Value>::~pair (this=0x1071e5c0, __in_chrg=<optimized out>) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_pair.h:68 No locals. #8 0x0000000000411491 in destroy (__p=<optimized out>, this=<optimized out>) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/ext/new_allocator.h:115 No locals. #9 std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString const, Json::Value> >, std::less<Json::Value::CZString>, std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_destroy_node (this=<optimized out>, __p=0x1071e5a0) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:383 No locals. #10 0x00000000004114f9 in std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString const, Json::Value> >, std::less<Json::Value::CZString>, std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_erase (this=0x1071e330, __x=0x1071e5a0) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:972 __y = 0x0 #11 0x00000000004114ea in std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString const, Json::Value> >, std::less<Json::Value::CZString>, std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_erase (this=0x1071e330, __x=0x1071e550) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:970 __y = <optimized out> #12 0x00000000004114ea in std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString const, Json::Value> >, std::less<Json::Value::CZString>, std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_erase (this=0x1071e330, __x=0x1071e500) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:970 __y = <optimized out> #13 0x00000000004114ea in std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString const, Json::Value> >, std::less<Json::Value::CZString>, std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_erase (this=0x1071e330, __x=0x1071e4b0) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:970 __y = <optimized out> #14 0x00000000004114ea in std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString const, Json::Value> >, std::less<Json::Value::CZString>, std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_erase (this=0x1071e330, __x=0x1071e460) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:970 __y = <optimized out> #15 0x00000000004114ea in std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString const, Json::Value> >, std::less<Json::Value::CZString>, 
std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_erase (this=0x1071e330, __x=0x1071e410) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:970 __y = <optimized out> #16 0x00000000004114ea in std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString cons---Type <return> to continue, or q <return> to quit--- t, Json::Value> >, std::less<Json::Value::CZString>, std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_erase (this=0x1071e330, __x=0x1071e3c0) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:970 __y = <optimized out> #17 0x00000000004114ea in std::_Rb_tree<Json::Value::CZString, std::pair<Json::Value::CZString const, Json::Value>, std::_Select1st<std::pair<Json::Value::CZString const, Json::Value> >, std::less<Json::Value::CZString>, std::allocator<std::pair<Json::Value::CZString const, Json::Value> > >::_M_erase (this=0x1071e330, __x=0x1071e370) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:970 __y = <optimized out> #18 0x00000000004113a0 in ~_Rb_tree (this=0x1071e330, __in_chrg=<optimized out>) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_tree.h:614 No locals. #19 ~map (this=0x1071e330, __in_chrg=<optimized out>) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_map.h:87 No locals. #20 Json::Value::~Value (this=0x107224a8, __in_chrg=<optimized out>) at src/lib_json/json_value.cpp:523 No locals. #21 0x0000000000403e57 in PareItemFest::~PareItemFest (this=0x10722450, __in_chrg=<optimized out>) at Src/mkconfig.cpp:50 No locals. #22 0x0000000000407cac in _Destroy<PareItemFest> (__pointer=0x10722450) at /usr/include/c++/4.8.2/bits/stl_construct.h:93 No locals. #23 std::_Destroy_aux<false>::__destroy<PareItemFest*> (__first=0x10722450, __last=0x10722850) at /usr/include/c++/4.8.2/bits/stl_construct.h:103 No locals. #24 0x0000000000407cf3 in _Destroy<PareItemFest*> (__last=<optimized out>, __first=<optimized out>) at /usr/include/c++/4.8.2/bits/stl_construct.h:126 No locals. #25 _Destroy<PareItemFest*, PareItemFest> (__last=<optimized out>, __first=<optimized out>) at /usr/include/c++/4.8.2/bits/stl_construct.h:151 No locals. #26 std::vector<PareItemFest, std::allocator<PareItemFest> >::~vector (this=0x61c310 <pareItemFest>, __in_chrg=<optimized out>) at /usr/include/c++/4.8.2/bits/stl_vector.h:415 No locals. #27 0x00007f463f17ece9 in __run_exit_handlers () from /lib64/libc.so.6 No symbol table info available. #28 0x00007f463f17ed37 in exit () from /lib64/libc.so.6 No symbol table info available. #29 0x00007f463f16755c in __libc_start_main () from /lib64/libc.so.6 No symbol table info available. #30 0x00000000004027b9 in _start () No symbol table info available.
6.2 'hist' trigger examples
---------------------------

The first set of examples creates aggregations using the kmalloc event.
The fields that can be used for the hist trigger are listed in the
kmalloc event's format file:

  # cat /sys/kernel/debug/tracing/events/kmem/kmalloc/format
  name: kmalloc
  ID: 374
  format:
      field:unsigned short common_type;           offset:0;   size:2;  signed:0;
      field:unsigned char common_flags;           offset:2;   size:1;  signed:0;
      field:unsigned char common_preempt_count;   offset:3;   size:1;  signed:0;
      field:int common_pid;                       offset:4;   size:4;  signed:1;

      field:unsigned long call_site;              offset:8;   size:8;  signed:0;
      field:const void * ptr;                     offset:16;  size:8;  signed:0;
      field:size_t bytes_req;                     offset:24;  size:8;  signed:0;
      field:size_t bytes_alloc;                   offset:32;  size:8;  signed:0;
      field:gfp_t gfp_flags;                      offset:40;  size:4;  signed:0;

We'll start by creating a hist trigger that generates a simple table
that lists the total number of bytes requested for each function in the
kernel that made one or more calls to kmalloc:

  # echo 'hist:key=call_site:val=bytes_req' > \
          /sys/kernel/debug/tracing/events/kmem/kmalloc/trigger

This tells the tracing system to create a 'hist' trigger using the
call_site field of the kmalloc event as the key for the table, which
just means that each unique call_site address will have an entry
created for it in the table.  The 'val=bytes_req' parameter tells the
hist trigger that for each unique entry (call_site) in the table, it
should keep a running total of the number of bytes requested by that
call_site.

We'll let it run for awhile and then dump the contents of the 'hist'
file in the kmalloc event's subdirectory (for readability, a number of
entries have been omitted):

  # cat /sys/kernel/debug/tracing/events/kmem/kmalloc/hist
  # trigger info: hist:keys=call_site:vals=bytes_req:sort=hitcount:size=2048 [active]

  { call_site: 18446744072106379007 } hitcount:          1  bytes_req:        176
  { call_site: 18446744071579557049 } hitcount:          1  bytes_req:       1024
  { call_site: 18446744071580608289 } hitcount:          1  bytes_req:      16384
  { call_site: 18446744071581827654 } hitcount:          1  bytes_req:         24
  { call_site: 18446744071580700980 } hitcount:          1  bytes_req:          8
  { call_site: 18446744071579359876 } hitcount:          1  bytes_req:        152
  { call_site: 18446744071580795365 } hitcount:          3  bytes_req:        144
  { call_site: 18446744071581303129 } hitcount:          3  bytes_req:        144
  { call_site: 18446744071580713234 } hitcount:          4  bytes_req:       2560
  { call_site: 18446744071580933750 } hitcount:          4  bytes_req:        736
  .
  .
  .
  { call_site: 18446744072106047046 } hitcount:         69  bytes_req:       5576
  { call_site: 18446744071582116407 } hitcount:         73  bytes_req:       2336
  { call_site: 18446744072106054684 } hitcount:        136  bytes_req:     140504
  { call_site: 18446744072106224230 } hitcount:        136  bytes_req:      19584
  { call_site: 18446744072106078074 } hitcount:        153  bytes_req:       2448
  { call_site: 18446744072106062406 } hitcount:        153  bytes_req:      36720
  { call_site: 18446744071582507929 } hitcount:        153  bytes_req:      37088
  { call_site: 18446744072102520590 } hitcount:        273  bytes_req:      10920
  { call_site: 18446744071582143559 } hitcount:        358  bytes_req:        716
  { call_site: 18446744072106465852 } hitcount:        417  bytes_req:      56712
  { call_site: 18446744072102523378 } hitcount:        485  bytes_req:      27160
  { call_site: 18446744072099568646 } hitcount:       1676  bytes_req:      33520

  Totals:
      Hits: 4610
      Entries: 45
      Dropped: 0

The output displays a line for each entry, beginning with the key
specified in the trigger, followed by the value(s) also specified in
the trigger.

At the beginning of the output is a line that displays the trigger
info, which can also be displayed by reading the 'trigger' file:

  # cat /sys/kernel/debug/tracing/events/kmem/kmalloc/trigger
  hist:keys=call_site:vals=bytes_req:sort=hitcount:size=2048 [active]

At the end of the output are a few lines that display the overall
totals for the run.  The 'Hits' field shows the total number of times
the event trigger was hit, the 'Entries' field shows the total number
of used entries in the hash table, and the 'Dropped' field shows the
number of hits that were dropped because the number of used entries
for the run exceeded the maximum number of entries allowed for the
table (normally 0, but if not a hint that you may want to increase the
size of the table using the 'size' parameter).

Notice in the above output that there's an extra field, 'hitcount',
which wasn't specified in the trigger.  Also notice that in the
trigger info output, there's a parameter, 'sort=hitcount', which
wasn't specified in the trigger either.  The reason for that is that
every trigger implicitly keeps a count of the total number of hits
attributed to a given entry, called the 'hitcount'.  That hitcount
information is explicitly displayed in the output, and in the absence
of a user-specified sort parameter, is used as the default sort field.

The value 'hitcount' can be used in place of an explicit value in the
'values' parameter if you don't really need to have any particular
field summed and are mainly interested in hit frequencies.

To turn the hist trigger off, simply call up the trigger in the
command history and re-execute it with a '!' prepended:

  # echo '!hist:key=call_site:val=bytes_req' > \
         /sys/kernel/debug/tracing/events/kmem/kmalloc/trigger

Finally, notice that the call_site as displayed in the output above
isn't really very useful.  It's an address, but normally addresses are
displayed in hex.  To have a numeric field displayed as a hex value,
simply append '.hex' to the field name in the trigger:

  # echo 'hist:key=call_site.hex:val=bytes_req' > \
         /sys/kernel/debug/tracing/events/kmem/kmalloc/trigger

  # cat /sys/kernel/debug/tracing/events/kmem/kmalloc/hist
  # trigger info: hist:keys=call_site.hex:vals=bytes_req:sort=hitcount:size=2048 [active]

  { call_site: ffffffffa026b291 } hitcount:          1  bytes_req:        433
  { call_site: ffffffffa07186ff } hitcount:          1  bytes_req:        176
  { call_site: ffffffff811ae721 } hitcount:          1  bytes_req:      16384
  { call_site: ffffffff811c5134 } hitcount:
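The table format shown above is regular enough to post-process. Below is a minimal Python sketch (an addition for illustration, not part of the kernel documentation) that parses the hist output and prints the top call sites by total bytes requested. The debugfs path and the line layout are taken from the example output; both would need adjusting on systems that mount tracefs elsewhere, and reading the file requires root.

    import re

    # Path and line format assumed from the example hist output above.
    HIST = '/sys/kernel/debug/tracing/events/kmem/kmalloc/hist'
    LINE = re.compile(r'\{ call_site: (\S+) \}\s+hitcount:\s+(\d+)\s+bytes_req:\s+(\d+)')

    def top_call_sites(path=HIST, n=10):
        """Return the n call sites with the largest bytes_req totals."""
        rows = []
        with open(path) as f:
            for line in f:
                m = LINE.search(line)
                if m:  # trigger-info and Totals lines simply don't match
                    rows.append((m.group(1), int(m.group(2)), int(m.group(3))))
        # Sort by bytes_req descending, similar in spirit to a
        # 'sort=bytes_req.descending' parameter in the trigger itself.
        return sorted(rows, key=lambda r: r[2], reverse=True)[:n]

    if __name__ == '__main__':
        for site, hits, req in top_call_sites():
            print('{:>20}  hitcount: {:>6}  bytes_req: {:>10}'.format(site, hits, req))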
import pickle
import numpy as np
import pandas as pd
from utils_model import combine_measure_waves, combine_global_features, combine_intermediate_features, performance_analysis
from utils_model import Attention, model_lstm, model_minimalRNN, model_temporalCNN
from tensorflow.keras.callbacks import ModelCheckpoint  # only ModelCheckpoint is used below
from tensorflow.keras import Model
from tensorflow.keras import backend as K
import xgboost as xgb
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
import lightgbm as lgb
import warnings
from tqdm import tqdm

# =============================================================================
# ################# NN Training & Extract intermediate features ###############
# =============================================================================
def Network_Training_single_fold(meta_df, signal_waves, signal_y, train_indices, val_indices, test_indices,
                                 indice_level='measurement', NN_level='signal', NN_model='LSTM', Dense_layers=2,
                                 NN_pretrained=False,
                                 ckpt_name='results_200chunks_weighted_bce/_LSTM_2Dense_layers_signal_level_iter0_fold0.h5',
                                 predict=True, intermediate_features=False, layer_idx=5, batch_size=512,
                                 monitor='val_loss', dropout=0.4, regularizer='l2', loss_name='weighted_bce',
                                 from_logits=True, weights_dict=None, kernel_size=[12,7], n_epochs=100,
                                 extract_attention_weights=False):
    ### train_indices, val_indices, test_indices are all measurement-level, so need to be adjusted first
    if indice_level == 'measurement':
        if NN_level == 'signal':
            train_indices = np.where(meta_df['id_measurement'].isin(train_indices))[0]
            val_indices = np.where(meta_df['id_measurement'].isin(val_indices))[0]
            test_indices = np.where(meta_df['id_measurement'].isin(test_indices))[0]
        elif NN_level == 'measurement':
            signal_waves = combine_measure_waves(signal_waves)
            # np.int/np.float were removed in NumPy 1.24; plain int/float behave identically here
            signal_y = (meta_df.groupby('id_measurement')['target'].sum().round(0).astype(int) != 0).astype(float)
    else:
        # if train_indices, val_indices, test_indices are all signal-level,
        # NN_level must be signal, then no need to process indices and waveforms
        assert NN_level == 'signal'
    train_X, train_y = signal_waves[train_indices], signal_y[train_indices]
    val_X, val_y = signal_waves[val_indices], signal_y[val_indices]
    test_X, test_y = signal_waves[test_indices], signal_y[test_indices]
    if loss_name == 'weighted_bce' and weights_dict is None:
        weights = len(train_y) / (np.bincount(train_y) * 2)
        weights_dict = {0: weights[0], 1: weights[1]}
    if NN_model == 'LSTM':
        model = model_lstm(signal_waves.shape, Dense_layers, dropout=dropout, regularizer=regularizer,
                           loss_name=loss_name, weights=weights_dict, from_logits=from_logits)
    elif NN_model == 'minimal_rnn':
        model = model_minimalRNN(signal_waves.shape, Dense_layers, dropout=dropout, regularizer=regularizer,
                                 loss_name=loss_name, weights=weights_dict, from_logits=from_logits)
    elif NN_model == 'TCN':
        signal_waves = signal_waves[..., np.newaxis]
        model = model_temporalCNN(signal_waves.shape, kernel_size=kernel_size, loss_name=loss_name,
                                  weights=weights_dict)
    if not NN_pretrained:
        ckpt = ModelCheckpoint(ckpt_name, save_best_only=True, save_weights_only=True, verbose=2,
                               monitor=monitor, mode='min')
        model.fit(train_X, train_y, batch_size=batch_size, epochs=n_epochs, validation_data=(val_X, val_y),
                  callbacks=[ckpt], verbose=2)
    model.load_weights(ckpt_name)
    if predict:
        yp = model.predict(train_X, batch_size=512)
        yp_val_fold = model.predict(val_X, batch_size=512)
        yp_test_fold = model.predict(test_X, batch_size=512)
    else:
        yp, yp_val_fold, yp_test_fold = None, None, None
    if intermediate_features:
        inter_model = Model(inputs=model.input, outputs=model.get_layer(index=layer_idx).output)
        inter_features = inter_model.predict(signal_waves)
    else:
        inter_features = None
    if extract_attention_weights:
        inter_model2 = Model(inputs=model.input, outputs=model.get_layer(index=2).output)
        inter_features_train = inter_model2.predict(train_X)
        weight = Attention(signal_waves.shape[1])(inter_features_train)[1]  # (60%*N, nchunks, 1)
    else:
        weight = None
    return yp, yp_val_fold, yp_test_fold, inter_features, weight


def whole_Network_training(meta_df, signal_waves, NN_level='signal', NN_model='LSTM', nchunks=200, Dense_layers=2,
                           NN_pretrained=False, layer_idx=5, NN_batch_size=512, indice_level='measurement',
                           output_folder='results_200chunks_weighted_bce', kfold_random_state=123948, num_folds=5,
                           num_iterations=25, predict=True, monitor='val_loss', weights_dict=None, dropout=0.4,
                           regularizer='l2', loss_name='bce', from_logits=True, kernel_size=[12,7], n_epochs=100,
                           extract_attention_weights=False):
    signal_y = meta_df['target'].values
    measure_y = (meta_df.groupby('id_measurement')['target'].sum().round(0).astype(int) != 0).astype(float)
    if predict:
        if NN_level == 'signal':
            yp_train = np.zeros(signal_y.shape[0])
            yp_val = np.zeros(signal_y.shape[0])
            yp_test = np.zeros(signal_y.shape[0])
        elif NN_level == 'measurement':
            yp_train = np.zeros(measure_y.shape[0])
            yp_val = np.zeros(measure_y.shape[0])
            yp_test = np.zeros(measure_y.shape[0])
    else:
        yp_train = None
        yp_val = None
        yp_test = None
    best_proba, metrics, test_pred = None, None, None
    if extract_attention_weights:
        attention_weights = np.zeros([signal_y.shape[0], nchunks, 1])
    else:
        attention_weights = None
    for iter in tqdm(range(num_iterations)):
        ##### split the dataset
        np.random.seed(kfold_random_state + iter)
        splits = np.zeros(measure_y.shape[0], dtype=int)
        m = measure_y == 1
        splits[m] = np.random.randint(0, 5, size=m.sum())
        m = measure_y == 0
        splits[m] = np.random.randint(0, 5, size=m.sum())
        # for fold in tqdm(range(num_folds)):
        for fold in range(num_folds):
            # print("Beginning iteration {}, fold {}".format(iter, fold))
            val_fold = fold
            test_fold = (fold + 1) % num_folds
            train_folds = [f for f in range(num_folds) if f not in [val_fold, test_fold]]
            train_indices = np.where(np.isin(splits, train_folds))[0]
            val_indices = np.where(splits == val_fold)[0]
            test_indices = np.where(splits == test_fold)[0]
            K.clear_session()
            # print("NN Training & Extracting intermediate features")
            ckpt_name = '{}/RNN_weights/{}_{}Dense_layers_{}_level_monitor_{}_iter{}_fold{}.h5'.format(
                output_folder, NN_model, Dense_layers, NN_level, monitor, iter, fold)
            yp, yp_val_fold, yp_test_fold, _, weight = Network_Training_single_fold(
                meta_df, signal_waves, signal_y, train_indices, val_indices, test_indices,
                indice_level=indice_level, NN_level=NN_level, NN_model=NN_model, Dense_layers=Dense_layers,
                NN_pretrained=NN_pretrained, ckpt_name=ckpt_name, predict=predict, intermediate_features=False,
                layer_idx=layer_idx, batch_size=NN_batch_size, monitor=monitor, dropout=dropout,
                regularizer=regularizer, loss_name=loss_name, kernel_size=kernel_size, n_epochs=n_epochs,
                weights_dict=weights_dict, from_logits=from_logits,
                extract_attention_weights=extract_attention_weights)
            if NN_level == 'signal':
                train_indices = np.where(meta_df['id_measurement'].isin(train_indices))[0]
                val_indices = np.where(meta_df['id_measurement'].isin(val_indices))[0]
                test_indices = np.where(meta_df['id_measurement'].isin(test_indices))[0]
            if predict:
                yp_train[train_indices] += yp[:,0]
                yp_val[val_indices] += yp_val_fold[:,0]
                yp_test[test_indices] += yp_test_fold[:,0]
            if extract_attention_weights:
                attention_weights[train_indices, :, :] += weight
    if predict:
        yp_train /= ((num_folds - 2) * num_iterations)
        yp_val /= num_iterations
        yp_test /= num_iterations
        if from_logits:
            yp_train = 1 / (1 + np.exp(-yp_train))
            yp_val = 1 / (1 + np.exp(-yp_val))
            yp_test = 1 / (1 + np.exp(-yp_test))
        best_proba, metrics, test_pred = performance_analysis(meta_df, yp_train, yp_val, yp_test,
                                                              predict_level=NN_level)
    if extract_attention_weights:
        attention_weights /= ((num_folds - 2) * num_iterations)
    return [yp_train, yp_val, yp_test], best_proba, metrics, test_pred, attention_weights


# =============================================================================
# ########################### Define Classifier & Training ####################
# =============================================================================
# In[] define the classifier & training
def training_classifier(X_data, y_data, feature_names, train_indices, val_indices, test_indices,
                        classifier='LightGBM', verbose_eval=0, predict=True, pretrained=False,
                        early_stopping_rounds=100,
                        model_file_name='LightGBM_measurement_level_global_features_iter0_fold0.dat',
                        units=(64,32)):
    train_X, train_y = X_data.values[train_indices], y_data[train_indices]
    val_X, val_y = X_data.values[val_indices], y_data[val_indices]
    test_X, test_y = X_data.values[test_indices], y_data[test_indices]
    if not pretrained:
        if classifier == 'random_forest':
            class_weight = dict({0: 0.5, 1: 2.0})
            model = RandomForestClassifier(bootstrap=True, class_weight=class_weight, criterion='gini',
                                           max_depth=8, max_features='auto', max_leaf_nodes=None,
                                           min_impurity_decrease=0.0, min_impurity_split=None,
                                           min_samples_leaf=4, min_samples_split=10,
                                           min_weight_fraction_leaf=0.0, n_estimators=300, n_jobs=-1,
                                           oob_score=False, random_state=23948, verbose=verbose_eval,
                                           warm_start=False)
            model.fit(train_X, train_y)
        elif classifier == 'XGboost':
            trn = xgb.DMatrix(train_X, label=train_y, feature_names=feature_names)
            val = xgb.DMatrix(val_X, label=val_y, feature_names=feature_names)
            test = xgb.DMatrix(test_X, label=test_y, feature_names=feature_names)
            params = {'objective': 'binary:logistic', 'nthread': 4, 'eval_metric': 'logloss'}
            evallist = [(trn, 'train'), (val, 'validation'), (test, 'test')]
            model = xgb.train(params, trn, num_boost_round=10000, evals=evallist,
                              verbose_eval=verbose_eval, early_stopping_rounds=early_stopping_rounds)
        elif classifier == 'LightGBM':
            params = {'objective': 'binary', 'boosting': 'gbdt', 'learning_rate': 0.01, 'num_leaves': 80,
                      'num_threads': 4, 'metric': 'binary_logloss', 'feature_fraction': 0.8,
                      'bagging_freq': 1, 'bagging_fraction': 0.8, 'seed': 23974, 'num_boost_round': 10000}
            trn = lgb.Dataset(train_X, train_y, feature_name=feature_names)
            val = lgb.Dataset(val_X, val_y, feature_name=feature_names)
            test = lgb.Dataset(test_X, test_y, feature_name=feature_names)
            # train model
            with warnings.catch_warnings():
                warnings.simplefilter("ignore")
                model = lgb.train(params, trn, valid_sets=(trn, test, val),
                                  valid_names=("train", "test", "validation"), fobj=None, feval=None,
                                  early_stopping_rounds=early_stopping_rounds, verbose_eval=verbose_eval)
        elif classifier == 'MLP':
            model = MLPClassifier(hidden_layer_sizes=units, random_state=1, verbose=verbose_eval)
            model.fit(train_X, train_y)
        elif classifier == 'Voting':
            # clf1 = LogisticRegression(random_state=1, max_iter=2000)
            clf2 = KNeighborsClassifier(n_neighbors=6)
            clf3 = GaussianNB()
            clf4 = SVC(kernel='rbf', probability=True)
            # clf5 = DecisionTreeClassifier(max_depth=8)
            # model = VotingClassifier(estimators=[('lr', clf1), ('knn', clf2), ('gnb', clf3),
            #                                      ('svc', clf4), ('dt', clf5)], voting='soft')
            model = VotingClassifier(estimators=[('knn', clf2), ('gnb', clf3), ('svc', clf4)], voting='soft')
            model = model.fit(train_X, train_y)
        pickle.dump(model, open(model_file_name, 'wb'))
    else:
        model = pickle.load(open(model_file_name, 'rb'))
    if predict:
        if classifier == 'random_forest' or classifier == 'MLP' or classifier == 'Voting':
            yp = model.predict_proba(train_X)[:,1]
            yp_val_fold = model.predict_proba(val_X)[:,1]
            yp_test_fold = model.predict_proba(test_X)[:,1]
        elif classifier == 'XGboost':
            yp = model.predict(xgb.DMatrix(train_X, feature_names=feature_names))
            yp_val_fold = model.predict(xgb.DMatrix(val_X, feature_names=feature_names))
            yp_test_fold = model.predict(xgb.DMatrix(test_X, feature_names=feature_names))
        elif classifier == 'LightGBM':
            yp = model.predict(train_X)
            yp_val_fold = model.predict(val_X)
            yp_test_fold = model.predict(test_X)
    else:
        yp, yp_val_fold, yp_test_fold = None, None, None
    return model, yp, yp_val_fold, yp_test_fold


# =============================================================================
# ######################## Whole Framework & Training #########################
# =============================================================================
def whole_process_training_single_iter(meta_df, signal_waves, global_features, local_features=True,
                                       NN_level='signal', NN_model='LSTM', Dense_layers=2, NN_pretrained=True,
                                       layer_idx=5, NN_batch_size=512,
                                       output_folder='results_200chunks_weighted_bce', classifier='XGboost',
                                       classifier_level='measurement', feature_set='global',
                                       kfold_random_state=123948, iter=0, early_stopping_rounds=100,
                                       num_folds=5, num_iterations=1, verbose_eval=0, load_local_features=True,
                                       predict=True, pretrained=False, monitor='val_loss', dropout=0.4,
                                       regularizer='l2', weights_dict=None, loss_name='weighted_bce',
                                       from_logits=True, kernel_size=[12,7], n_epochs=100, units=(128,64,32)):
    global_features, y_data, feature_names = combine_global_features(meta_df, global_features,
                                                                     classifier_level=classifier_level,
                                                                     feature_set=feature_set)
    if local_features:
        signal_y = meta_df['target'].values
        if load_local_features:
            all_local_features = pickle.load(open('{}/local_features_{}_{}Dense_layers_{}_level_layer_{}.dat'.format(
                output_folder, NN_model, Dense_layers, NN_level, layer_idx), 'rb'))
    if predict:
        yp_train = np.zeros(global_features.shape[0])
        yp_val = np.zeros(global_features.shape[0])
        yp_test = np.zeros(global_features.shape[0])
    else:
        yp_train = None
        yp_val = None
        yp_test = None
    best_proba, metrics, test_pred = None, None, None
    # models = []
    np.random.seed(kfold_random_state + iter)
    splits = np.zeros(global_features.shape[0], dtype=int)
    m = y_data == 1
    splits[m] = np.random.randint(0, 5, size=m.sum())
    m = y_data == 0
    splits[m] = np.random.randint(0, 5, size=m.sum())
    for fold in tqdm(range(num_folds)):
        print("Beginning iteration {}, fold {}".format(iter, fold))
        val_fold = fold
        test_fold = (fold + 1) % num_folds
        train_folds = [f for f in range(num_folds) if f not in [val_fold, test_fold]]
        train_indices = np.where(np.isin(splits, train_folds))[0]
        val_indices = np.where(splits == val_fold)[0]
        test_indices = np.where(splits == test_fold)[0]
        if local_features:
            if load_local_features:
                inter_features = all_local_features[5*iter + fold]
            else:
                K.clear_session()
                print("NN Training & Extracting intermediate features")
                ckpt_name = '{}/RNN_weights/{}/{}_{}Dense_layers_{}_level_monitor_{}_iter{}_fold{}.h5'.format(
                    output_folder, loss_name, NN_model, Dense_layers, NN_level, monitor, iter, fold)
                _, _, _, inter_features, _ = Network_Training_single_fold(
                    meta_df, signal_waves, signal_y, train_indices, val_indices, test_indices,
                    indice_level=classifier_level, NN_level=NN_level, NN_model=NN_model,
                    Dense_layers=Dense_layers, NN_pretrained=NN_pretrained, ckpt_name=ckpt_name,
                    predict=False, intermediate_features=True, layer_idx=layer_idx,
                    batch_size=NN_batch_size, monitor=monitor, dropout=dropout, regularizer=regularizer,
                    loss_name=loss_name, from_logits=from_logits, weights_dict=weights_dict,
                    kernel_size=kernel_size, n_epochs=n_epochs)
            num_feature = inter_features.shape[1]
            #### Combine intermediate features & global features ####
            X_data, feature_names = combine_intermediate_features(global_features, inter_features,
                                                                  NN_level=NN_level,
                                                                  classifier_level=classifier_level)
        else:
            X_data = global_features
        ############# Input final features to the classifier ###################
        if not local_features:
            model_file_name = '{}/models/global_scale/{}_{}_level_{}_features_iter{}_fold{}.dat'.format(
                output_folder, classifier, classifier_level, feature_set, iter, fold)
        else:
            model_file_name = '{}/models/{}_{}Dense_layers_{}_level_monitor_{}_{}interfeatures_{}_{}_level_iter{}_fold{}.dat'.format(
                output_folder, NN_model, Dense_layers, NN_level, monitor, num_feature, classifier,
                classifier_level, iter, fold)
        print("Classifier Training: ")
        model, yp, yp_val_fold, yp_test_fold = training_classifier(
            X_data, y_data, feature_names, train_indices, val_indices, test_indices, classifier=classifier,
            predict=predict, pretrained=pretrained, early_stopping_rounds=early_stopping_rounds,
            verbose_eval=verbose_eval, model_file_name=model_file_name, units=units)
        # models.append(model)
        if predict:
            yp_train[train_indices] += yp
            yp_val[val_indices] += yp_val_fold
            yp_test[test_indices] += yp_test_fold
    if predict:
        yp_train /= ((num_folds - 2) * num_iterations)
        yp_val /= num_iterations
        yp_test /= num_iterations
        best_proba, metrics, test_pred = performance_analysis(meta_df, yp_train, yp_val, yp_test,
                                                              predict_level=classifier_level)
    return [yp_train, yp_val, yp_test], best_proba, metrics, test_pred


def whole_process_training(meta_df, signal_waves, global_features, local_features=True, NN_level='signal',
                           NN_model='LSTM', Dense_layers=2, NN_pretrained=True, layer_idx=5, NN_batch_size=512,
                           output_folder='results_200chunks_weighted_bce', classifier='XGboost',
                           classifier_level='measurement', feature_set='global', kfold_random_state=123948,
                           early_stopping_rounds=100, num_folds=5, num_iterations=25, verbose_eval=0,
                           load_local_features=True, predict=True, pretrained=True, monitor='val_loss',
                           dropout=0.4, weights_dict=None, regularizer='l2', loss_name='weighted_bce',
                           from_logits=True, kernel_size=[12,7], n_epochs=100, units=(64,32)):
    global_features, y_data, feature_names = combine_global_features(meta_df, global_features,
                                                                     classifier_level=classifier_level,
                                                                     feature_set=feature_set)
    if local_features:
        signal_y = meta_df['target'].values
        if load_local_features:
            all_local_features = pickle.load(open('{}/local_features_{}_{}Dense_layers_{}_level_layer_{}.dat'.format(
                output_folder, NN_model, Dense_layers, NN_level, layer_idx), 'rb'))
        else:
            all_local_features = []
            # all_local_features = np.zeros((global_features.shape[0], num_folds, num_iterations))
    if predict:
        yp_train = np.zeros(global_features.shape[0])
        yp_val = np.zeros(global_features.shape[0])
        yp_test = np.zeros(global_features.shape[0])
    else:
        yp_train = None
        yp_val = None
        yp_test = None
    best_proba, metrics, test_pred = None, None, None
    # models = []
    for iter in tqdm(range(num_iterations)):
        ##### split the dataset
        np.random.seed(kfold_random_state + iter)
        splits = np.zeros(global_features.shape[0], dtype=int)
        m = y_data == 1
        splits[m] = np.random.randint(0, 5, size=m.sum())
        m = y_data == 0
        splits[m] = np.random.randint(0, 5, size=m.sum())
        # for fold in tqdm(range(num_folds)):
        for fold in range(num_folds):
            # print("Beginning iteration {}, fold {}".format(iter, fold))
            val_fold = fold
            test_fold = (fold + 1) % num_folds
            train_folds = [f for f in range(num_folds) if f not in [val_fold, test_fold]]
            train_indices = np.where(np.isin(splits, train_folds))[0]
            val_indices = np.where(splits == val_fold)[0]
            test_indices = np.where(splits == test_fold)[0]
            if local_features:
                if load_local_features:
                    inter_features = all_local_features[5*iter + fold]
                else:
                    K.clear_session()
                    # print("NN Training & Extracting intermediate features")
                    ckpt_name = '{}/RNN_weights/{}_{}Dense_layers_{}_level_monitor_{}_iter{}_fold{}.h5'.format(
                        output_folder, NN_model, Dense_layers, NN_level, monitor, iter, fold)
                    _, _, _, inter_features, _ = Network_Training_single_fold(
                        meta_df, signal_waves, signal_y, train_indices, val_indices, test_indices,
                        indice_level=classifier_level, NN_level=NN_level, NN_model=NN_model,
                        Dense_layers=Dense_layers, NN_pretrained=NN_pretrained, ckpt_name=ckpt_name,
                        predict=False, intermediate_features=True, layer_idx=layer_idx,
                        batch_size=NN_batch_size, monitor=monitor, dropout=dropout, regularizer=regularizer,
                        loss_name=loss_name, from_logits=from_logits, weights_dict=weights_dict,
                        kernel_size=kernel_size, n_epochs=n_epochs)
                    # num_feature = inter_features.shape[1]
                    all_local_features.append(inter_features)
                #### Combine intermediate features & global features ####
                X_data, feature_names = combine_intermediate_features(global_features, inter_features,
                                                                      NN_level=NN_level,
                                                                      classifier_level=classifier_level)
                num_feature = inter_features.shape[1]
            else:
                X_data = global_features
            ############# Input final features to the classifier ###################
            if not local_features:
                model_file_name = '{}/models/global_scale/{}_{}_level_{}_features_iter{}_fold{}.dat'.format(
                    output_folder, classifier, classifier_level, feature_set, iter, fold)
            else:
                model_file_name = '{}/models/{}_{}Dense_layers_{}_level_monitor_{}_{}interfeatures_{}_{}_level_iter{}_fold{}.dat'.format(
                    output_folder, NN_model, Dense_layers, NN_level, monitor, num_feature, classifier,
                    classifier_level, iter, fold)
            # print("Classifier Training: ")
            model, yp, yp_val_fold, yp_test_fold = training_classifier(
                X_data, y_data, feature_names, train_indices, val_indices, test_indices,
                classifier=classifier, predict=predict, pretrained=pretrained,
                early_stopping_rounds=early_stopping_rounds, verbose_eval=verbose_eval,
                model_file_name=model_file_name, units=units)
            # models.append(model)
            if predict:
                yp_train[train_indices] += yp
                yp_val[val_indices] += yp_val_fold
                yp_test[test_indices] += yp_test_fold
    if local_features:
        if not load_local_features:
            all_local_features = np.array(all_local_features)
            pickle.dump(all_local_features, open('{}/local_features_{}_{}Dense_layers_{}_level_layer_{}.dat'.format(
                output_folder, NN_model, Dense_layers, NN_level, layer_idx), 'wb'))
    if predict:
        yp_train /= ((num_folds - 2) * num_iterations)
        yp_val /= num_iterations
        yp_test /= num_iterations
        best_proba, metrics, test_pred = performance_analysis(meta_df, yp_train, yp_val, yp_test,
                                                              predict_level=classifier_level)
    return [yp_train, yp_val, yp_test], best_proba, metrics, test_pred

Q: What are the output variables of this program?

A: The main output variables are the following:

1. [yp_train, yp_val, yp_test]
   Meaning: the model's predictions, i.e. the predicted probabilities for the training, validation and test splits respectively.
   Type: numpy arrays.
   Use: downstream performance analysis and evaluation of the model's predictive power.

2. best_proba
   Meaning: the predicted probability at the best threshold. During performance analysis the program selects the best threshold according to some metric (e.g. F1 score or AUC) and returns the corresponding probability.
   Type: numpy array.
   Use: evaluating the model's performance at the best threshold.

3. metrics
   Meaning: the model's performance metrics, typically accuracy, recall, F1 score, AUC (area under the ROC curve), etc.
   Type: dict or list, depending on the implementation of performance_analysis.
   Use: assessing overall model performance and comparing different models or hyperparameter settings.

4. test_pred
   Meaning: the final predictions on the test set, usually the 0/1 labels of this binary classification problem.
   Type: numpy array.
   Use: evaluating the model's actual classification performance on the test set.

5. attention_weights (if extract_attention_weights=True)
   Meaning: when the attention mechanism is enabled during NN training, the attention weight of each time step is returned.
   Type: numpy array of shape (n_samples, n_timesteps, 1).
   Use: analysing how much the model attends to different parts of the input; helps interpret model behaviour.

6. all_local_features (if local_features=True and load_local_features=False)
   Meaning: when local feature extraction is enabled and no pretrained local features are loaded, the extracted local (intermediate) features are saved.
   Type: numpy array, persisted as a pickle file.
   Use: subsequent model training or analysis.

7. model
   Meaning: the trained classifier.
   Type: an sklearn, XGBoost or LightGBM model, depending on the classifier chosen.
   Use: saving the model for later use (predicting new data or further analysis).

8. model_file_name
   Meaning: the file path under which the model is saved.
   Type: string.
   Use: storing the trained model so it can be reloaded later.

Summary: the program's main outputs are the model predictions (yp_train, yp_val, yp_test), the performance metrics (metrics), the best-threshold probability (best_proba) and the test-set predictions (test_pred). Depending on configuration it may additionally output attention weights (attention_weights) and local features (all_local_features). These outputs can be used to evaluate performance, interpret model behaviour, or persist models for later use. To make the averaging behind the prediction arrays concrete, the sketch after this answer replays the fold rotation used by the training loops.
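The following is a minimal standalone sketch (an illustration, simplified from the code above: one randint call instead of the per-class split, stand-in labels) of the rotation in which each fold serves once as validation and once as test. It shows why every sample lands in the train split exactly (num_folds - 2) times per iteration, which is the divisor applied to yp_train at the end of the training loops.

    import numpy as np

    num_folds = 5
    y = np.random.randint(0, 2, size=100)            # stand-in labels
    rng = np.random.RandomState(123948)
    splits = rng.randint(0, num_folds, size=y.shape[0])

    train_counts = np.zeros(y.shape[0], dtype=int)
    for fold in range(num_folds):
        val_fold = fold
        test_fold = (fold + 1) % num_folds           # next fold becomes the test fold
        train_folds = [f for f in range(num_folds) if f not in (val_fold, test_fold)]
        train_indices = np.where(np.isin(splits, train_folds))[0]
        train_counts[train_indices] += 1

    # Each sample is val once and test once, hence trains num_folds - 2 times.
    assert (train_counts == num_folds - 2).all()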
Q: Where does the data first enter this program?

A: Judging from the module itself, the raw data enters through the arguments of the top-level functions: meta_df (the metadata frame carrying the target and id_measurement columns), signal_waves (the waveform array) and global_features are all produced outside this module and passed into whole_process_training, whole_process_training_single_iter or whole_Network_training. The first place the data is actually consumed for training is inside Network_Training_single_fold, where train_X, train_y = signal_waves[train_indices], signal_y[train_indices] slices the waveforms and labels into the training split (signal_y itself comes from meta_df['target'].values). When load_local_features=True, the pickled local-feature file loaded at the top of the whole_process_* functions is an additional data entry point.
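For completeness, here is a hypothetical driver showing how these entry points are wired together. The file names and loading calls are illustrative assumptions, not part of the original module; it assumes the functions above are importable (e.g. from a train_model module) and that a preprocessing step, not shown in this code, has produced the three inputs.

    import pickle
    import numpy as np
    import pandas as pd

    # Illustrative input files; the real preprocessing pipeline is not shown here.
    meta_df = pd.read_csv('metadata_train.csv')            # id_measurement, target, ...
    signal_waves = np.load('signal_waves_200chunks.npy')   # (n_signals, nchunks, n_features)
    global_features = pickle.load(open('global_features.dat', 'rb'))

    # Run the full framework: NN intermediate features + LightGBM classifier.
    preds, best_proba, metrics, test_pred = whole_process_training(
        meta_df, signal_waves, global_features,
        local_features=True, NN_model='LSTM', classifier='LightGBM',
        num_folds=5, num_iterations=25, predict=True)
    yp_train, yp_val, yp_test = preds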