[Neuer Adapter] Proxmox VM

Entwicklung · proxmox · 486 posts, 75 commenters, 112.8k views, 58 watching
David G.:

@arteck said in [Neuer Adapter] Proxmox VM:

@david-g it restarts.. and then runs again.. the question is just why it restarts??
set it to debug, then post everything from this line (that is the start)

Using Proxmox API: https://192.168.99.58:8006/api2/json


up to the abort

The line appears several times in the log, so there are a couple of restarts in there.
Debug was apparently still enabled, as @dp20eic already noticed.
The admin shows me no error messages about the adapter crashing regularly.
At the moment it is also green and still not working.
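
Since every restart writes that "Using Proxmox API" line, the restarts can also be logged directly from a javascript-adapter script by watching the instance's alive state (a minimal sketch; system.adapter.proxmox.0.alive is the alive state the js-controller maintains, everything else here is just illustration):

// Log every stop/start of the proxmox instance with an exact timestamp,
// so unexpected restarts are easy to spot later.
on({ id: 'system.adapter.proxmox.0.alive', change: 'ne' }, (obj) => {
    const what = obj.state.val ? 'started' : 'stopped';
    log(`proxmox.0 ${what} at ${new Date().toISOString()}`, 'warn');
});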

#319 arteck (Developer, Most Active):

@david-g said in [Neuer Adapter] Proxmox VM:

At the moment it is also green and still not working.

And how exactly do you tell that??

@dp20eic yes, in that snippet.. but not in the long LOG that was posted


#320 David G.:

@arteck said in [Neuer Adapter] Proxmox VM:

At the moment it is also green and still not working.

Because since the evening of the 12th not a single DP has been updated.

@arteck said in [Neuer Adapter] Proxmox VM:

but not in the long LOG that was posted

I will set it to debug again and restart the adapter.
I had deliberately left it as it was for now.


#321 arteck (Developer, Most Active), last edited by arteck:

@david-g which ones exactly.. the DPs are only updated when something has changed.. so please be precise
and how is the polling interval set?


#322 David G., last edited by David G.:

@arteck

Initially I had the interval at 30 s.
I then changed it to 5 min to see whether the interval might be too fast.

I always check it via the CPU load of the iobroker VM, something should be happening there...
I also see it in the table of my backups, which triggers on a change of the DP.
Until just now it still showed 12.12., after the restart 14.12. is now current.

I just wrote a small script so that I can see when the adapter drops out.
The polling interval is set to 2 min.

          fbfba420-891b-4cf9-a606-70550d8af543-image.png
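
Such a watchdog could look roughly like this as an ioBroker javascript-adapter script (only a sketch, not David's actual script; the monitored datapoint id and the 10-minute threshold are assumptions):

// Warn when the proxmox adapter has not updated its datapoints for a while.
// HEARTBEAT should be a state that the adapter normally refreshes on every poll.
const HEARTBEAT = 'proxmox.0.node_pve.uptime';  // assumed example id
const MAX_AGE_MS = 10 * 60 * 1000;              // 10 minutes without a change counts as stalled

schedule('*/5 * * * *', () => {                 // check every 5 minutes
    const state = getState(HEARTBEAT);
    if (!state || !state.lc) return;            // state missing or never written
    const ageMs = Date.now() - state.lc;        // lc = timestamp of the last change
    if (ageMs > MAX_AGE_MS) {
        log(`proxmox.0 looks stalled: ${HEARTBEAT} unchanged for ${Math.round(ageMs / 60000)} min`, 'warn');
    }
});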


#323 lobomau:

@david-g I have had the problem for a few days as well. I also tried a few downgrades of the Proxmox adapter. For now I am setting up a restart schedule, once a day, roughly as sketched below.
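
As a javascript-adapter script such a daily restart could look like this (a sketch only; the 03:30 time is arbitrary and exec() has to be allowed in the javascript instance settings):

// Restart the proxmox instance once a day as a workaround for the stalling.
schedule('30 3 * * *', () => {                      // every day at 03:30
    exec('iobroker restart proxmox.0', (err) => {   // uses the ioBroker CLI
        if (err) log(`restart of proxmox.0 failed: ${err}`, 'error');
    });
});

The instance settings in the admin (expert mode) should also offer a CRON-based restart schedule, which achieves the same thing without a script.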

Just now I noticed that the adapter had not delivered anything for almost 2 days. After an adapter restart it runs again:

            a3fe6dc8-ea32-488b-807e-cebfa639ae6e-image.png

Host: NUC8i3 with Proxmox:

            • ioBroker CT Debian 13, npm 10.9.3, nodejs 22.20.0
            • Slave: Pi4

#324 David G.:

@lobomau
Then I am not the only one.


#325 Negalein (Global Moderator):

@david-g said in [Neuer Adapter] Proxmox VM:

Then I am not the only one.

No, same here.
I only just noticed it now.

#326 David G.:

Could this perhaps be related to Proxmox 8?
The problem cannot have existed for very long yet.

Version 8 is still fairly "new".
Or has a lot been reworked in the adapter recently as well?


#327 Negalein (Global Moderator):

@david-g said in [Neuer Adapter] Proxmox VM:

Could this perhaps be related to Proxmox 8?

No, since I am still running 7.2-7.

#328 David G., last edited by David G.:

@arteck
Yesterday evening the adapter dropped out again.

Here is the last update:
Screenshot_20231216_102320_Chrome.jpg

The downloaded log is 5 MB (compressed).
May I e-mail it to you? No idea whether I will find all the passwords in it. Then I would not have to upload it here.

EDIT
Wasn't that about the same time of day as when the adapter last stopped?
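
One way to mask obvious credentials before passing such a log on (a rough Node.js sketch; the file names are placeholders and the pattern only catches password= parameters, like the one in the adapter's /access/ticket request further down in this thread):

// Mask password= query parameters in a log file before mailing or uploading it.
const fs = require('fs');
const raw = fs.readFileSync('iobroker.2023-12-16.log', 'utf8');  // placeholder file name
const scrubbed = raw.replace(/(password=)[^&\s"]+/gi, '$1***');  // keep the key, hide the value
fs.writeFileSync('iobroker.2023-12-16.scrubbed.log', scrubbed);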


#329 A former user:

@david-g said in [Neuer Adapter] Proxmox VM:

Wasn't that about the same time of day as when the adapter last stopped?

Moin,

according to your big log from yesterday it was 23:12. When do you run a backup? Or, put differently, what runs on your machine at that time of day?
Can you take a look at the logs of the machine itself, i.e. not ioBroker but Linux, either

# sudo journalctl -g proxmox
# sudo journalctl -g error


VG
Bernd
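
Since the stall apparently happened around 23:12, it can also help to narrow the journal to that window, for example with sudo journalctl --since "2023-12-15 22:30" --until "2023-12-16 00:30" -g proxmox (the times here are only an assumption based on the log).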

arteck:

@david-g post the complete LOG..

#330 Eduard77, last edited by Eduard77:

@arteck
The adapter drops out for me as well.
At 11:11 I restarted the adapter. The DPs are not being updated, though.
f55d338c-2de2-486a-b979-d6852df1d170-image.png

Attached is a part of my log.

                          2023-12-16 11:11:47.173 - debug: proxmox.0 (206924) sendRequest interval started
                          2023-12-16 11:11:47.186 - debug: proxmox.0 (206924) received 200 response from /nodes with content: {"data":[{"maxmem":16325120000,"disk":16431407104,"node":"pve","maxcpu":4,"type":"node","level":"","status":"online","id":"node/pve","maxdisk":68959993856,"mem":4057702400,"uptime":228588,"cpu":0.0461170848267622,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86"}]}
                          2023-12-16 11:11:47.186 - debug: proxmox.0 (206924) Nodes: [{"maxmem":16325120000,"disk":16431407104,"node":"pve","maxcpu":4,"type":"node","level":"","status":"online","id":"node/pve","maxdisk":68959993856,"mem":4057702400,"uptime":228588,"cpu":0.0461170848267622,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86"}]
                          2023-12-16 11:11:47.186 - debug: proxmox.0 (206924) Node: {"maxmem":16325120000,"disk":16431407104,"node":"pve","maxcpu":4,"type":"node","level":"","status":"online","id":"node/pve","maxdisk":68959993856,"mem":4057702400,"uptime":228588,"cpu":0.0461170848267622,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86"}
                          2023-12-16 11:11:47.234 - debug: proxmox.0 (206924) Requesting states for node pve
                          2023-12-16 11:11:47.246 - debug: proxmox.0 (206924) received 200 response from /nodes/pve/status with content: {"data":{"boot-info":{"mode":"efi","secureboot":0},"swap":{"total":7885287424,"free":7885025280,"used":262144},"current-kernel":{"sysname":"Linux","release":"6.5.11-7-pve","version":"#1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z)","machine":"x86_64"},"loadavg":["0.14","0.12","0.14"],"memory":{"used":4036128768,"free":12288991232,"total":16325120000},"ksm":{"shared":0},"cpuinfo":{"cpus":4,"cores":4,"user_hz":100,"hvm":"1","sockets":1,"model":"Intel(R) Pentium(R) Silver J5040 CPU @ 2.00GHz","flags":"fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave rdrand lahf_lm 3dnowprefetch cpuid_fault cat_l2 cdp_l2 ssbd ibrs ibpb stibp ibrs_enhanced tpr_shadow flexpriority ept vpid ept_ad fsgsbase tsc_adjust sgx smep erms mpx rdt_a rdseed smap clflushopt intel_pt sha_ni xsaveopt xsavec xgetbv1 xsaves dtherm ida arat pln pts vnmi umip rdpid sgx_lc md_clear arch_capabilities","mhz":"2995.209"},"rootfs":{"free":52528586752,"total":68959993856,"avail":48978739200,"used":16431407104},"wait":0.000940660029787568,"idle":0,"uptime":228597,"cpu":0.0493062632280317,"pveversion":"pve-manager/8.1.3/b46aac3b42da5d15","kversion":"Linux 6.5.11-7-pve #1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z)"}}
                          2023-12-16 11:11:47.513 - debug: proxmox.0 (206924) received 200 response from /nodes/pve/disks/list with content: {"data":[{"wearout":100,"by_id_link":"/dev/disk/by-id/ata-Patriot_Burst_Elite_240GB_PBEIICB22122105143","size":240057409536,"gpt":1,"vendor":"ATA ","wwn":"unknown","osdid":-1,"osdid-list":null,"rpm":0,"devpath":"/dev/sda","used":"BIOS boot","health":"PASSED","serial":"PBEIICB22122105143","type":"ssd","model":"Patriot_Burst_Elite_240GB"}]}
                          2023-12-16 11:11:47.575 - debug: proxmox.0 (206924) received 200 response from /nodes/pve/disks/smart?disk=/dev/sda with content: {"data":{"type":"ata","attributes":[{"raw":"0","fail":"-","name":"Raw_Read_Error_Rate","id":" 1","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100},{"value":100,"worst":100,"normalized":100,"name":"Reallocated_Sector_Ct","flags":"-O--CK","threshold":50,"id":" 5","fail":"-","raw":"0"},{"fail":"-","raw":"5825","flags":"-O--CK","threshold":50,"id":" 9","name":"Power_On_Hours","worst":100,"normalized":100,"value":100},{"name":"Power_Cycle_Count","flags":"-O--CK","id":" 12","threshold":50,"value":100,"worst":100,"normalized":100,"fail":"-","raw":"41"},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","threshold":50,"id":"160","name":"Unknown_Attribute","fail":"-","raw":"29"},{"fail":"-","raw":"100","flags":"-O--CK","threshold":50,"id":"161","name":"Unknown_Attribute","worst":100,"normalized":100,"value":100},{"raw":"120","fail":"-","threshold":50,"id":"163","flags":"-O--CK","name":"Unknown_Attribute","normalized":100,"worst":100,"value":100},{"value":100,"normalized":100,"worst":100,"name":"Unknown_Attribute","threshold":50,"id":"164","flags":"-O--CK","raw":"14","fail":"-"},{"raw":"29","fail":"-","threshold":50,"id":"165","flags":"-O--CK","name":"Unknown_Attribute","normalized":100,"worst":100,"value":100},{"value":100,"normalized":100,"worst":100,"name":"Unknown_Attribute","threshold":50,"id":"166","flags":"-O--CK","raw":"1","fail":"-"},{"flags":"-O--CK","threshold":50,"id":"167","name":"Unknown_Attribute","worst":100,"normalized":100,"value":100,"fail":"-","raw":"8"},{"raw":"0","fail":"-","name":"Unknown_Attribute","threshold":50,"id":"168","flags":"-O--CK","value":100,"normalized":100,"worst":100},{"fail":"-","raw":"100","value":100,"worst":100,"normalized":100,"name":"Unknown_Attribute","flags":"-O--CK","id":"169","threshold":50},{"normalized":100,"worst":100,"value":100,"threshold":50,"id":"175","flags":"-O--CK","name":"Program_Fail_Count_Chip","raw":"0","fail":"-"},{"raw":"8678","fail":"-","value":100,"normalized":100,"worst":100,"name":"Erase_Fail_Count_Chip","id":"176","threshold":50,"flags":"-O--CK"},{"raw":"574561","fail":"-","value":100,"normalized":100,"worst":100,"name":"Wear_Leveling_Count","id":"177","threshold":50,"flags":"-O--CK"},{"fail":"-","raw":"0","name":"Used_Rsvd_Blk_Cnt_Chip","flags":"-O--CK","id":"178","threshold":50,"value":100,"worst":100,"normalized":100},{"raw":"0","fail":"-","threshold":50,"id":"181","flags":"-O--CK","name":"Program_Fail_Cnt_Total","normalized":100,"worst":100,"value":100},{"name":"Erase_Fail_Count_Total","flags":"-O--CK","threshold":50,"id":"182","value":100,"worst":100,"normalized":100,"fail":"-","raw":"0"},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","id":"192","threshold":50,"name":"Power-Off_Retract_Count","fail":"-","raw":"22"},{"fail":"-","raw":"45","name":"Temperature_Celsius","flags":"-O--CK","threshold":50,"id":"194","value":100,"worst":100,"normalized":100},{"normalized":100,"worst":100,"value":100,"threshold":50,"id":"195","flags":"-O--CK","name":"Hardware_ECC_Recovered","raw":"155","fail":"-"},{"fail":"-","raw":"0","value":100,"worst":100,"normalized":100,"name":"Reallocated_Event_Count","flags":"-O--CK","id":"196","threshold":50},{"raw":"0","fail":"-","threshold":50,"id":"197","flags":"-O--CK","name":"Current_Pending_Sector","normalized":100,"worst":100,"value":100},{"flags":"-O--CK","id":"198","threshold":50,"name":"Offline_Uncorrectab
le","worst":100,"normalized":100,"value":100,"fail":"-","raw":"0"},{"raw":"0","fail":"-","normalized":100,"worst":100,"value":100,"threshold":50,"id":"199","flags":"-O--CK","name":"UDMA_CRC_Error_Count"},{"fail":"-","raw":"100","name":"Available_Reservd_Space","flags":"-O--CK","threshold":50,"id":"232","value":100,"worst":100,"normalized":100},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","threshold":50,"id":"241","name":"Total_LBAs_Written","fail":"-","raw":"20562"},{"name":"Total_LBAs_Read","flags":"-O--CK","threshold":50,"id":"242","value":100,"worst":100,"normalized":100,"fail":"-","raw":"82971"},{"name":"Unknown_Attribute","id":"245","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100,"raw":"56715","fail":"-"}],"health":"PASSED"}}
                          2023-12-16 11:11:47.587 - debug: proxmox.0 (206924) received 200 response from /cluster/ha/status/current with content: {"data":[{"id":"quorum","status":"OK","quorate":1,"node":"pve","type":"quorum"}]}
                          2023-12-16 11:11:47.602 - debug: proxmox.0 (206924) received 200 response from /cluster/resources with content: {"data":[{"vmid":100,"cpu":0.0261146581719999,"netout":43099540121,"name":"Shinobi","diskread":1490972672,"id":"lxc/100","mem":365735936,"maxdisk":16729894912,"uptime":141077,"status":"running","netin":52909991634,"diskwrite":262647808,"type":"lxc","template":0,"maxcpu":2,"maxmem":4294967296,"disk":2378874880,"node":"pve"},{"maxcpu":4,"disk":4724740096,"maxmem":8589934592,"node":"pve","netin":1102548917,"diskwrite":11619495936,"type":"lxc","template":0,"diskread":2164809728,"id":"lxc/104","name":"DebianIO","uptime":228567,"maxdisk":16729894912,"mem":2370220032,"status":"running","cpu":0.0242493254454285,"netout":1022054850,"vmid":104},{"uptime":228588,"maxdisk":68959993856,"mem":4057702400,"id":"node/pve","status":"online","cgroup-mode":2,"cpu":0.0461170848267622,"maxcpu":4,"node":"pve","maxmem":16325120000,"disk":16431407104,"level":"","type":"node"},{"id":"storage/pve/local","maxdisk":68959993856,"disk":16431407104,"plugintype":"dir","shared":0,"node":"pve","status":"available","storage":"local","content":"backup,vztmpl,iso","type":"storage"},{"maxdisk":140387549184,"id":"storage/pve/local-lvm","shared":0,"status":"available","node":"pve","disk":10234252335,"plugintype":"lvmthin","storage":"local-lvm","content":"rootdir,images","type":"storage"},{"storage":"nas","content":"backup,rootdir","type":"storage","maxdisk":2913889878016,"id":"storage/pve/nas","shared":1,"node":"pve","status":"available","disk":1963875176448,"plugintype":"cifs"},{"status":"ok","node":"pve","sdn":"localnetwork","type":"sdn","id":"sdn/pve/localnetwork"}]}
                          2023-12-16 11:11:47.642 - debug: proxmox.0 (206924) received 200 response from /nodes/pve/lxc/100/status/current with content: {"data":{"ha":{"managed":0},"name":"Shinobi","swap":0,"pid":488202,"netin":52914051379,"maxswap":536870912,"mem":365359104,"status":"running","netout":43104216945,"disk":2378874880,"vmid":100,"maxmem":4294967296,"cpus":2,"diskread":1490972672,"uptime":141086,"type":"lxc","maxdisk":16729894912,"cpu":0.0224964981732962,"diskwrite":262647808}}
                          2023-12-16 11:11:47.643 - debug: proxmox.0 (206924) found states: [["proxmox.0.lxc.Shinobi","pid","default_num",488202],["proxmox.0.lxc.Shinobi","netin","sizeb",52914051379],["proxmox.0.lxc.Shinobi","mem_lev","level",8.51],["proxmox.0.lxc.Shinobi","mem","size",348],["proxmox.0.lxc.Shinobi","status","text","running"],["proxmox.0.lxc.Shinobi","netout","sizeb",43104216945],["proxmox.0.lxc.Shinobi","disk_lev","level",14.22],["proxmox.0.lxc.Shinobi","disk","size",2269],["proxmox.0.lxc.Shinobi","vmid","default_num",100],["proxmox.0.lxc.Shinobi","maxmem","size",4096],["proxmox.0.lxc.Shinobi","cpus","default_num",2],["proxmox.0.lxc.Shinobi","uptime","time",141086],["proxmox.0.lxc.Shinobi","type","text","lxc"],["proxmox.0.lxc.Shinobi","maxdisk","size",15955],["proxmox.0.lxc.Shinobi","cpu","level",2.24],["proxmox.0.lxc.Shinobi","diskwrite","size",250]]
                          2023-12-16 11:11:47.782 - debug: proxmox.0 (206924) received 200 response from /nodes/pve/lxc/104/status/current with content: {"data":{"cpu":0.0218476170369629,"maxdisk":16729894912,"type":"lxc","diskwrite":11619495936,"cpus":4,"maxmem":8589934592,"vmid":104,"disk":4722827264,"uptime":228576,"diskread":2164809728,"maxswap":1073741824,"netout":1022113003,"status":"running","mem":2371432448,"swap":20480,"name":"DebianIO","ha":{"managed":0},"netin":1102609354,"pid":948}}
                          2023-12-16 11:11:47.782 - debug: proxmox.0 (206924) found states: [["proxmox.0.lxc.DebianIO","cpu","level",2.18],["proxmox.0.lxc.DebianIO","maxdisk","size",15955],["proxmox.0.lxc.DebianIO","type","text","lxc"],["proxmox.0.lxc.DebianIO","diskwrite","size",11081],["proxmox.0.lxc.DebianIO","cpus","default_num",4],["proxmox.0.lxc.DebianIO","maxmem","size",8192],["proxmox.0.lxc.DebianIO","vmid","default_num",104],["proxmox.0.lxc.DebianIO","disk_lev","level",28.23],["proxmox.0.lxc.DebianIO","disk","size",4504],["proxmox.0.lxc.DebianIO","uptime","time",228576],["proxmox.0.lxc.DebianIO","netout","sizeb",1022113003],["proxmox.0.lxc.DebianIO","status","text","running"],["proxmox.0.lxc.DebianIO","mem_lev","level",27.61],["proxmox.0.lxc.DebianIO","mem","size",2262],["proxmox.0.lxc.DebianIO","netin","sizeb",1102609354],["proxmox.0.lxc.DebianIO","pid","default_num",948]]
                          2023-12-16 11:11:48.348 - debug: proxmox.0 (206924) received 200 response from /nodes/pve/storage/local/status with content: {"data":{"enabled":1,"avail":48978739200,"content":"iso,vztmpl,backup","used":16431407104,"active":1,"total":68959993856,"shared":0,"type":"dir"}}
                          2023-12-16 11:11:48.348 - debug: proxmox.0 (206924) found states: [["proxmox.0.storage.pve_local","enabled","default_num",1],["proxmox.0.storage.pve_local","avail","size",46710],["proxmox.0.storage.pve_local","content","text","iso,vztmpl,backup"],["proxmox.0.storage.pve_local","used_lev","level",23.83],["proxmox.0.storage.pve_local","used","size",15670],["proxmox.0.storage.pve_local","active","default_num",1],["proxmox.0.storage.pve_local","total","size",65765],["proxmox.0.storage.pve_local","shared","default_num",0],["proxmox.0.storage.pve_local","type","text","dir"]]
                          2023-12-16 11:11:48.969 - debug: proxmox.0 (206924) received 200 response from /nodes/pve/storage/local-lvm/status with content: {"data":{"enabled":1,"avail":130153296849,"content":"images,rootdir","used":10234252335,"active":1,"total":140387549184,"shared":0,"type":"lvmthin"}}
                          2023-12-16 11:11:48.969 - debug: proxmox.0 (206924) found states: [["proxmox.0.storage.pve_local-lvm","enabled","default_num",1],["proxmox.0.storage.pve_local-lvm","avail","size",124124],["proxmox.0.storage.pve_local-lvm","content","text","images,rootdir"],["proxmox.0.storage.pve_local-lvm","used_lev","level",7.29],["proxmox.0.storage.pve_local-lvm","used","size",9760],["proxmox.0.storage.pve_local-lvm","active","default_num",1],["proxmox.0.storage.pve_local-lvm","total","size",133884],["proxmox.0.storage.pve_local-lvm","shared","default_num",0],["proxmox.0.storage.pve_local-lvm","type","text","lvmthin"]]
                          2023-12-16 11:11:49.505 - debug: proxmox.0 (206924) received 200 response from /nodes/pve/storage/nas/status with content: {"data":{"used":1963879501824,"active":1,"type":"cifs","shared":1,"total":2913889878016,"content":"backup,rootdir","avail":950010376192,"enabled":1}}
                          2023-12-16 11:11:49.505 - debug: proxmox.0 (206924) found states: [["proxmox.0.storage.nas","used_lev","level",67.4],["proxmox.0.storage.nas","used","size",1872901],["proxmox.0.storage.nas","active","default_num",1],["proxmox.0.storage.nas","type","text","cifs"],["proxmox.0.storage.nas","shared","default_num",1],["proxmox.0.storage.nas","total","size",2778902],["proxmox.0.storage.nas","content","text","backup,rootdir"],["proxmox.0.storage.nas","avail","size",906000],["proxmox.0.storage.nas","enabled","default_num",1]]
                          2023-12-16 11:11:56.971 - info: host.DebianIO stopInstance system.adapter.proxmox.0 (force=false, process=true)
                          2023-12-16 11:11:56.974 - info: proxmox.0 (206924) Got terminate signal TERMINATE_YOURSELF
                          2023-12-16 11:11:56.975 - debug: proxmox.0 (206924) clearing request timeout
                          2023-12-16 11:11:56.975 - info: proxmox.0 (206924) terminating
                          2023-12-16 11:11:56.976 - info: proxmox.0 (206924) Terminated (ADAPTER_REQUESTED_TERMINATION): Without reason
                          2023-12-16 11:11:57.015 - info: host.DebianIO stopInstance system.adapter.proxmox.0 send kill signal
                          2023-12-16 11:11:57.478 - info: proxmox.0 (206924) terminating
                          2023-12-16 11:11:57.512 - info: host.DebianIO instance system.adapter.proxmox.0 terminated with code 11 (ADAPTER_REQUESTED_TERMINATION)
                          2023-12-16 11:12:00.063 - info: host.DebianIO instance system.adapter.proxmox.0 started with pid 211668
                          2023-12-16 11:12:00.617 - debug: proxmox.0 (211668) Redis Objects: Use Redis connection: 127.0.0.1:9001
                          2023-12-16 11:12:00.638 - debug: proxmox.0 (211668) Objects client ready ... initialize now
                          2023-12-16 11:12:00.639 - debug: proxmox.0 (211668) Objects create System PubSub Client
                          2023-12-16 11:12:00.640 - debug: proxmox.0 (211668) Objects create User PubSub Client
                          2023-12-16 11:12:00.666 - debug: proxmox.0 (211668) Objects client initialize lua scripts
                          2023-12-16 11:12:00.673 - debug: proxmox.0 (211668) Objects connected to redis: 127.0.0.1:9001
                          2023-12-16 11:12:00.686 - debug: proxmox.0 (211668) Redis States: Use Redis connection: 127.0.0.1:9000
                          2023-12-16 11:12:00.692 - debug: proxmox.0 (211668) States create System PubSub Client
                          2023-12-16 11:12:00.693 - debug: proxmox.0 (211668) States create User PubSub Client
                          2023-12-16 11:12:00.744 - debug: proxmox.0 (211668) States connected to redis: 127.0.0.1:9000
                          2023-12-16 11:12:00.820 - info: proxmox.0 (211668) starting. Version 2.2.2 in /opt/iobroker/node_modules/iobroker.proxmox, node: v18.19.0, js-controller: 5.0.17
                          2023-12-16 11:12:00.831 - warn: proxmox.0 (211668) Using Proxmox API: https://192.168.178.60:8006/api2/json
                          2023-12-16 11:12:00.984 - debug: proxmox.0 (211668) received 200 response from /access/ticket?username=root@pam&password=sweedi74 with content: {"data":{"username":"root@pam","cap":{"vms":{"VM.Audit":1,"VM.Backup":1,"VM.Config.CDROM":1,"VM.Monitor":1,"VM.Migrate":1,"VM.Config.Disk":1,"VM.Snapshot":1,"VM.Config.Memory":1,"VM.Console":1,"VM.Clone":1,"VM.Config.HWType":1,"VM.Config.Cloudinit":1,"VM.Allocate":1,"VM.PowerMgmt":1,"VM.Config.Options":1,"Permissions.Modify":1,"VM.Config.CPU":1,"VM.Config.Network":1,"VM.Snapshot.Rollback":1},"access":{"Group.Allocate":1,"User.Modify":1,"Permissions.Modify":1},"dc":{"SDN.Use":1,"SDN.Audit":1,"SDN.Allocate":1,"Sys.Audit":1,"Sys.Modify":1},"mapping":{"Mapping.Use":1,"Permissions.Modify":1,"Mapping.Modify":1,"Mapping.Audit":1},"nodes":{"Sys.PowerMgmt":1,"Sys.Audit":1,"Sys.Console":1,"Sys.Modify":1,"Permissions.Modify":1,"Sys.Incoming":1,"Sys.Syslog":1},"sdn":{"Permissions.Modify":1,"SDN.Use":1,"SDN.Audit":1,"SDN.Allocate":1},"storage":{"Datastore.AllocateTemplate":1,"Datastore.Audit":1,"Datastore.AllocateSpace":1,"Permissions.Modify":1,"Datastore.Allocate":1}},"CSRFPreventionToken":"657D77F0:HktEa4wYWSWtS2XW7oBlyZJiL82thkPvFOzU9ybz9oY","ticket":"PVE:root@pam:657D77F0::EHOyME2Yqvxxl8dSj5bKnYPkDrDe7zpAB3Opw9HWtLq1WDFa4kkotLutWSqRhoxB6ziD4PNKqGKjSp4W5HiBdudf77RWV9jdyxKmV8BejgxbYdQ9ENI6osHiI5EjWSvpOMCJ/VIK2VoF4hhN6jTBO3FsQAWoySd+7Ed6gJUtG/JlOPdPP7Ru4U4BVYogVNvLXZ0NMPrbEQyhrGiAyiNNoIW5lV/4Y9agjw84yZkuPG493Xmm2mLds4ObDFIRlchtMKWDuTs7+X6cclbgf6s6dmVUpd4fG9prsQ75OY7tIM9twhYxoBximIpLoFHv29yyUqS5woP5IXsNn9H8zcTLKA=="}}
                          2023-12-16 11:12:00.984 - debug: proxmox.0 (211668) dataticket: {"data":{"username":"root@pam","cap":{"vms":{"VM.Audit":1,"VM.Backup":1,"VM.Config.CDROM":1,"VM.Monitor":1,"VM.Migrate":1,"VM.Config.Disk":1,"VM.Snapshot":1,"VM.Config.Memory":1,"VM.Console":1,"VM.Clone":1,"VM.Config.HWType":1,"VM.Config.Cloudinit":1,"VM.Allocate":1,"VM.PowerMgmt":1,"VM.Config.Options":1,"Permissions.Modify":1,"VM.Config.CPU":1,"VM.Config.Network":1,"VM.Snapshot.Rollback":1},"access":{"Group.Allocate":1,"User.Modify":1,"Permissions.Modify":1},"dc":{"SDN.Use":1,"SDN.Audit":1,"SDN.Allocate":1,"Sys.Audit":1,"Sys.Modify":1},"mapping":{"Mapping.Use":1,"Permissions.Modify":1,"Mapping.Modify":1,"Mapping.Audit":1},"nodes":{"Sys.PowerMgmt":1,"Sys.Audit":1,"Sys.Console":1,"Sys.Modify":1,"Permissions.Modify":1,"Sys.Incoming":1,"Sys.Syslog":1},"sdn":{"Permissions.Modify":1,"SDN.Use":1,"SDN.Audit":1,"SDN.Allocate":1},"storage":{"Datastore.AllocateTemplate":1,"Datastore.Audit":1,"Datastore.AllocateSpace":1,"Permissions.Modify":1,"Datastore.Allocate":1}},"CSRFPreventionToken":"657D77F0:HktEa4wYWSWtS2XW7oBlyZJiL82thkPvFOzU9ybz9oY","ticket":"PVE:root@pam:657D77F0::EHOyME2Yqvxxl8dSj5bKnYPkDrDe7zpAB3Opw9HWtLq1WDFa4kkotLutWSqRhoxB6ziD4PNKqGKjSp4W5HiBdudf77RWV9jdyxKmV8BejgxbYdQ9ENI6osHiI5EjWSvpOMCJ/VIK2VoF4hhN6jTBO3FsQAWoySd+7Ed6gJUtG/JlOPdPP7Ru4U4BVYogVNvLXZ0NMPrbEQyhrGiAyiNNoIW5lV/4Y9agjw84yZkuPG493Xmm2mLds4ObDFIRlchtMKWDuTs7+X6cclbgf6s6dmVUpd4fG9prsQ75OY7tIM9twhYxoBximIpLoFHv29yyUqS5woP5IXsNn9H8zcTLKA=="}}
                          2023-12-16 11:12:00.984 - debug: proxmox.0 (211668) Updating ticket to "PVE:root@pam:657D77F0::EHOyME2Yqvxxl8dSj5bKnYPkDrDe7zpAB3Opw9HWtLq1WDFa4kkotLutWSqRhoxB6ziD4PNKqGKjSp4W5HiBdudf77RWV9jdyxKmV8BejgxbYdQ9ENI6osHiI5EjWSvpOMCJ/VIK2VoF4hhN6jTBO3FsQAWoySd+7Ed6gJUtG/JlOPdPP7Ru4U4BVYogVNvLXZ0NMPrbEQyhrGiAyiNNoIW5lV/4Y9agjw84yZkuPG493Xmm2mLds4ObDFIRlchtMKWDuTs7+X6cclbgf6s6dmVUpd4fG9prsQ75OY7tIM9twhYxoBximIpLoFHv29yyUqS5woP5IXsNn9H8zcTLKA==" and CSRF to "657D77F0:HktEa4wYWSWtS2XW7oBlyZJiL82thkPvFOzU9ybz9oY"
                          2023-12-16 11:12:01.001 - debug: proxmox.0 (211668) [readObjects] reading objects: {"proxmox.0.info":{"_id":"proxmox.0.info","type":"channel","common":{"name":"Information"},"native":{},"from":"system.adapter.proxmox.0","ts":1702721520810,"acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"user":"system.user.admin","enums":{}},"proxmox.0.node_pve":{"type":"channel","common":{"name":"pve"},"native":{"type":"node"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1690911801585,"_id":"proxmox.0.node_pve","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.storage_nas":{"type":"channel","common":{"name":"nas"},"native":{"type":"storage"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1690911802484,"_id":"proxmox.0.storage_nas","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.lxc_DebianIO":{"type":"channel","common":{"name":"DebianIO"},"native":{"type":"lxc"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1698233495846,"_id":"proxmox.0.lxc_DebianIO","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.storage_pve_local-lvm":{"type":"channel","common":{"name":"local-lvm"},"native":{"type":"storage"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1699381262893,"_id":"proxmox.0.storage_pve_local-lvm","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.storage_pve_local":{"type":"channel","common":{"name":"local"},"native":{"type":"storage"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1699381263498,"_id":"proxmox.0.storage_pve_local","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.lxc_Shinobi":{"type":"channel","common":{"name":"Shinobi"},"native":{"type":"lxc"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1702714582554,"_id":"proxmox.0.lxc_Shinobi","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.lxc.Shinobi":{"type":"channel","common":{"name":"Shinobi"},"native":{"type":"lxc"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1702717033201,"_id":"proxmox.0.lxc.Shinobi","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.lxc.DebianIO":{"type":"channel","common":{"name":"DebianIO"},"native":{"type":"lxc"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1702717033347,"_id":"proxmox.0.lxc.DebianIO","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.storage.pve_local":{"type":"channel","common":{"name":"local"},"native":{"type":"storage"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1702717033480,"_id":"proxmox.0.storage.pve_local","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"proxmox.0.storage.pve_local-lvm":{"type":"channel","common":{"name":"local-lvm"},"native":{"type":"storage"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1702717034054,"_id":"proxmox.0.storage.pve_local-lvm","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}},"p
roxmox.0.storage.nas":{"type":"channel","common":{"name":"nas"},"native":{"type":"storage"},"from":"system.adapter.proxmox.0","user":"system.user.admin","ts":1702717034629,"_id":"proxmox.0.storage.nas","acl":{"object":1636,"owner":"system.user.admin","ownerGroup":"system.group.administrator"},"enums":{}}}
                          2023-12-16 11:12:01.016 - debug: proxmox.0 (211668) received 200 response from /nodes with content: {"data":[{"cpu":0.0510496183206107,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","status":"online","uptime":228609,"mem":3993620480,"maxdisk":68959993856,"id":"node/pve","level":"","type":"node","node":"pve","disk":16431407104,"maxmem":16325120000,"maxcpu":4}]}
                          2023-12-16 11:12:01.016 - debug: proxmox.0 (211668) Nodes: [{"cpu":0.0510496183206107,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","status":"online","uptime":228609,"mem":3993620480,"maxdisk":68959993856,"id":"node/pve","level":"","type":"node","node":"pve","disk":16431407104,"maxmem":16325120000,"maxcpu":4}]
                          2023-12-16 11:12:01.017 - debug: proxmox.0 (211668) Node: {"cpu":0.0510496183206107,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","status":"online","uptime":228609,"mem":3993620480,"maxdisk":68959993856,"id":"node/pve","level":"","type":"node","node":"pve","disk":16431407104,"maxmem":16325120000,"maxcpu":4}
                          2023-12-16 11:12:01.071 - info: tuya.0 (205300) bf20c93cd48d53b9335f4u: Error on Reconnect (1): connection timed out
                          2023-12-16 11:12:01.081 - debug: proxmox.0 (211668) Requesting states for node pve
                          2023-12-16 11:12:01.096 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/status with content: {"data":{"cpu":0.0552006710543296,"pveversion":"pve-manager/8.1.3/b46aac3b42da5d15","kversion":"Linux 6.5.11-7-pve #1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z)","ksm":{"shared":0},"cpuinfo":{"hvm":"1","sockets":1,"model":"Intel(R) Pentium(R) Silver J5040 CPU @ 2.00GHz","cpus":4,"cores":4,"user_hz":100,"flags":"fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave rdrand lahf_lm 3dnowprefetch cpuid_fault cat_l2 cdp_l2 ssbd ibrs ibpb stibp ibrs_enhanced tpr_shadow flexpriority ept vpid ept_ad fsgsbase tsc_adjust sgx smep erms mpx rdt_a rdseed smap clflushopt intel_pt sha_ni xsaveopt xsavec xgetbv1 xsaves dtherm ida arat pln pts vnmi umip rdpid sgx_lc md_clear arch_capabilities","mhz":"2995.209"},"wait":0.00103239127629372,"rootfs":{"free":52528586752,"total":68959993856,"avail":48978739200,"used":16431407104},"idle":0,"uptime":228611,"current-kernel":{"machine":"x86_64","version":"#1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z)","release":"6.5.11-7-pve","sysname":"Linux"},"loadavg":["0.17","0.13","0.15"],"memory":{"used":4037464064,"free":12287655936,"total":16325120000},"boot-info":{"mode":"efi","secureboot":0},"swap":{"used":262144,"total":7885287424,"free":7885025280}}}
                          2023-12-16 11:12:01.371 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/disks/list with content: {"data":[{"devpath":"/dev/sda","used":"BIOS boot","health":"PASSED","serial":"PBEIICB22122105143","type":"ssd","model":"Patriot_Burst_Elite_240GB","osdid-list":null,"rpm":0,"vendor":"ATA ","wwn":"unknown","osdid":-1,"wearout":100,"by_id_link":"/dev/disk/by-id/ata-Patriot_Burst_Elite_240GB_PBEIICB22122105143","size":240057409536,"gpt":1}]}
                          2023-12-16 11:12:01.435 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/disks/smart?disk=/dev/sda with content: {"data":{"health":"PASSED","attributes":[{"fail":"-","raw":"0","value":100,"worst":100,"normalized":100,"name":"Raw_Read_Error_Rate","flags":"-O--CK","id":" 1","threshold":50},{"name":"Reallocated_Sector_Ct","flags":"-O--CK","id":" 5","threshold":50,"value":100,"worst":100,"normalized":100,"fail":"-","raw":"0"},{"flags":"-O--CK","id":" 9","threshold":50,"name":"Power_On_Hours","worst":100,"normalized":100,"value":100,"fail":"-","raw":"5825"},{"value":100,"normalized":100,"worst":100,"name":"Power_Cycle_Count","threshold":50,"id":" 12","flags":"-O--CK","raw":"41","fail":"-"},{"raw":"29","fail":"-","name":"Unknown_Attribute","id":"160","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100},{"threshold":50,"id":"161","flags":"-O--CK","name":"Unknown_Attribute","normalized":100,"worst":100,"value":100,"raw":"100","fail":"-"},{"raw":"120","fail":"-","name":"Unknown_Attribute","id":"163","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100},{"fail":"-","raw":"14","flags":"-O--CK","threshold":50,"id":"164","name":"Unknown_Attribute","worst":100,"normalized":100,"value":100},{"flags":"-O--CK","threshold":50,"id":"165","name":"Unknown_Attribute","worst":100,"normalized":100,"value":100,"fail":"-","raw":"29"},{"raw":"1","fail":"-","normalized":100,"worst":100,"value":100,"threshold":50,"id":"166","flags":"-O--CK","name":"Unknown_Attribute"},{"fail":"-","raw":"8","flags":"-O--CK","threshold":50,"id":"167","name":"Unknown_Attribute","worst":100,"normalized":100,"value":100},{"name":"Unknown_Attribute","flags":"-O--CK","id":"168","threshold":50,"value":100,"worst":100,"normalized":100,"fail":"-","raw":"0"},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","id":"169","threshold":50,"name":"Unknown_Attribute","fail":"-","raw":"100"},{"normalized":100,"worst":100,"value":100,"id":"175","threshold":50,"flags":"-O--CK","name":"Program_Fail_Count_Chip","raw":"0","fail":"-"},{"value":100,"worst":100,"normalized":100,"name":"Erase_Fail_Count_Chip","flags":"-O--CK","threshold":50,"id":"176","fail":"-","raw":"8678"},{"name":"Wear_Leveling_Count","flags":"-O--CK","threshold":50,"id":"177","value":100,"worst":100,"normalized":100,"fail":"-","raw":"574561"},{"raw":"0","fail":"-","value":100,"normalized":100,"worst":100,"name":"Used_Rsvd_Blk_Cnt_Chip","id":"178","threshold":50,"flags":"-O--CK"},{"raw":"0","fail":"-","name":"Program_Fail_Cnt_Total","threshold":50,"id":"181","flags":"-O--CK","value":100,"normalized":100,"worst":100},{"id":"182","threshold":50,"flags":"-O--CK","name":"Erase_Fail_Count_Total","normalized":100,"worst":100,"value":100,"raw":"0","fail":"-"},{"name":"Power-Off_Retract_Count","id":"192","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100,"raw":"22","fail":"-"},{"fail":"-","raw":"45","name":"Temperature_Celsius","flags":"-O--CK","id":"194","threshold":50,"value":100,"worst":100,"normalized":100},{"name":"Hardware_ECC_Recovered","flags":"-O--CK","id":"195","threshold":50,"value":100,"worst":100,"normalized":100,"fail":"-","raw":"155"},{"id":"196","threshold":50,"flags":"-O--CK","name":"Reallocated_Event_Count","normalized":100,"worst":100,"value":100,"raw":"0","fail":"-"},{"raw":"0","fail":"-","name":"Current_Pending_Sector","id":"197","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100},{"fail":"-","raw":"0","worst":100,"normalized":100,"value":100,"fl
ags":"-O--CK","id":"198","threshold":50,"name":"Offline_Uncorrectable"},{"threshold":50,"id":"199","flags":"-O--CK","name":"UDMA_CRC_Error_Count","normalized":100,"worst":100,"value":100,"raw":"0","fail":"-"},{"flags":"-O--CK","threshold":50,"id":"232","name":"Available_Reservd_Space","worst":100,"normalized":100,"value":100,"fail":"-","raw":"100"},{"raw":"20562","fail":"-","name":"Total_LBAs_Written","threshold":50,"id":"241","flags":"-O--CK","value":100,"normalized":100,"worst":100},{"normalized":100,"worst":100,"value":100,"id":"242","threshold":50,"flags":"-O--CK","name":"Total_LBAs_Read","raw":"82971","fail":"-"},{"raw":"56715","fail":"-","normalized":100,"worst":100,"value":100,"id":"245","threshold":50,"flags":"-O--CK","name":"Unknown_Attribute"}],"type":"ata"}}
                          2023-12-16 11:12:01.448 - debug: proxmox.0 (211668) received 200 response from /cluster/ha/status/current with content: {"data":[{"id":"quorum","type":"quorum","status":"OK","quorate":1,"node":"pve"}]}
                          2023-12-16 11:12:01.477 - debug: proxmox.0 (211668) received 200 response from /cluster/resources with content: {"data":[{"uptime":141097,"mem":364195840,"maxdisk":16729894912,"diskread":1490972672,"name":"Shinobi","id":"lxc/100","status":"running","netout":43108529618,"cpu":0.0194493497227651,"vmid":100,"maxcpu":2,"node":"pve","maxmem":4294967296,"disk":2378874880,"diskwrite":262647808,"netin":52919345379,"template":0,"type":"lxc"},{"node":"pve","maxmem":8589934592,"disk":4722831360,"maxcpu":4,"template":0,"type":"lxc","diskwrite":11619516416,"netin":1102698503,"status":"running","uptime":228587,"mem":2321465344,"maxdisk":16729894912,"id":"lxc/104","diskread":2164809728,"name":"DebianIO","netout":1022234086,"cpu":0.0261148557658611,"vmid":104},{"disk":16431407104,"maxmem":16325120000,"node":"pve","maxcpu":4,"type":"node","level":"","status":"online","id":"node/pve","uptime":228609,"mem":3993620480,"maxdisk":68959993856,"cpu":0.0510496183206107,"cgroup-mode":2},{"plugintype":"lvmthin","disk":10234252335,"node":"pve","status":"available","shared":0,"id":"storage/pve/local-lvm","maxdisk":140387549184,"type":"storage","content":"rootdir,images","storage":"local-lvm"},{"type":"storage","content":"backup,rootdir","storage":"nas","node":"pve","status":"available","shared":1,"plugintype":"cifs","disk":1963883827200,"maxdisk":2913889878016,"id":"storage/pve/nas"},{"storage":"local","type":"storage","content":"iso,vztmpl,backup","maxdisk":68959993856,"id":"storage/pve/local","node":"pve","status":"available","shared":0,"plugintype":"dir","disk":16431407104},{"sdn":"localnetwork","node":"pve","status":"ok","type":"sdn","id":"sdn/pve/localnetwork"}]}
                          2023-12-16 11:12:01.534 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/lxc/100/status/current with content: {"data":{"maxswap":536870912,"status":"running","mem":364699648,"netout":43109392605,"ha":{"managed":0},"swap":0,"name":"Shinobi","pid":488202,"netin":52920581108,"maxdisk":16729894912,"type":"lxc","cpu":0.0241888161197654,"diskwrite":262647808,"vmid":100,"disk":2378874880,"cpus":2,"maxmem":4294967296,"uptime":141100,"diskread":1490972672}}
                          2023-12-16 11:12:01.535 - debug: proxmox.0 (211668) new lxc: Shinobi - {"maxswap":536870912,"status":"running","mem":364699648,"netout":43109392605,"ha":{"managed":0},"swap":0,"name":"Shinobi","pid":488202,"netin":52920581108,"maxdisk":16729894912,"type":"lxc","cpu":0.0241888161197654,"diskwrite":262647808,"vmid":100,"disk":2378874880,"cpus":2,"maxmem":4294967296,"uptime":141100,"diskread":1490972672}
                          2023-12-16 11:12:01.535 - debug: proxmox.0 (211668) found states: [["proxmox.0.lxc.Shinobi","status","text","running"],["proxmox.0.lxc.Shinobi","mem_lev","level",8.49],["proxmox.0.lxc.Shinobi","mem","size",348],["proxmox.0.lxc.Shinobi","netout","sizeb",43109392605],["proxmox.0.lxc.Shinobi","pid","default_num",488202],["proxmox.0.lxc.Shinobi","netin","sizeb",52920581108],["proxmox.0.lxc.Shinobi","maxdisk","size",15955],["proxmox.0.lxc.Shinobi","type","text","lxc"],["proxmox.0.lxc.Shinobi","cpu","level",2.41],["proxmox.0.lxc.Shinobi","diskwrite","size",250],["proxmox.0.lxc.Shinobi","vmid","default_num",100],["proxmox.0.lxc.Shinobi","disk_lev","level",14.22],["proxmox.0.lxc.Shinobi","disk","size",2269],["proxmox.0.lxc.Shinobi","cpus","default_num",2],["proxmox.0.lxc.Shinobi","maxmem","size",4096],["proxmox.0.lxc.Shinobi","uptime","time",141100]]
                          2023-12-16 11:12:01.655 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/lxc/104/status/current with content: {"data":{"uptime":228590,"diskread":2164809728,"cpus":4,"maxmem":8589934592,"vmid":104,"disk":4722851840,"diskwrite":11619532800,"cpu":0.0257367550157141,"maxdisk":16729894912,"type":"lxc","netin":1102736332,"pid":948,"swap":20480,"name":"DebianIO","ha":{"managed":0},"netout":1022289851,"status":"running","mem":2373537792,"maxswap":1073741824}}
                          2023-12-16 11:12:01.655 - debug: proxmox.0 (211668) new lxc: DebianIO - {"uptime":228590,"diskread":2164809728,"cpus":4,"maxmem":8589934592,"vmid":104,"disk":4722851840,"diskwrite":11619532800,"cpu":0.0257367550157141,"maxdisk":16729894912,"type":"lxc","netin":1102736332,"pid":948,"swap":20480,"name":"DebianIO","ha":{"managed":0},"netout":1022289851,"status":"running","mem":2373537792,"maxswap":1073741824}
                          2023-12-16 11:12:01.655 - debug: proxmox.0 (211668) found states: [["proxmox.0.lxc.DebianIO","uptime","time",228590],["proxmox.0.lxc.DebianIO","cpus","default_num",4],["proxmox.0.lxc.DebianIO","maxmem","size",8192],["proxmox.0.lxc.DebianIO","vmid","default_num",104],["proxmox.0.lxc.DebianIO","disk_lev","level",28.23],["proxmox.0.lxc.DebianIO","disk","size",4504],["proxmox.0.lxc.DebianIO","diskwrite","size",11081],["proxmox.0.lxc.DebianIO","cpu","level",2.57],["proxmox.0.lxc.DebianIO","maxdisk","size",15955],["proxmox.0.lxc.DebianIO","type","text","lxc"],["proxmox.0.lxc.DebianIO","netin","sizeb",1102736332],["proxmox.0.lxc.DebianIO","pid","default_num",948],["proxmox.0.lxc.DebianIO","netout","sizeb",1022289851],["proxmox.0.lxc.DebianIO","status","text","running"],["proxmox.0.lxc.DebianIO","mem_lev","level",27.63],["proxmox.0.lxc.DebianIO","mem","size",2264]]
                          2023-12-16 11:12:02.268 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/storage/local-lvm/status with content: {"data":{"content":"rootdir,images","avail":130153296849,"enabled":1,"type":"lvmthin","total":140387549184,"shared":0,"used":10234252335,"active":1}}
                          2023-12-16 11:12:02.268 - debug: proxmox.0 (211668) new storage: local-lvm - {"content":"rootdir,images","avail":130153296849,"enabled":1,"type":"lvmthin","total":140387549184,"shared":0,"used":10234252335,"active":1}
                          2023-12-16 11:12:02.268 - debug: proxmox.0 (211668) found states: [["proxmox.0.storage.pve_local-lvm","content","text","rootdir,images"],["proxmox.0.storage.pve_local-lvm","avail","size",124124],["proxmox.0.storage.pve_local-lvm","enabled","default_num",1],["proxmox.0.storage.pve_local-lvm","type","text","lvmthin"],["proxmox.0.storage.pve_local-lvm","total","size",133884],["proxmox.0.storage.pve_local-lvm","shared","default_num",0],["proxmox.0.storage.pve_local-lvm","used_lev","level",7.29],["proxmox.0.storage.pve_local-lvm","used","size",9760],["proxmox.0.storage.pve_local-lvm","active","default_num",1]]
                          2023-12-16 11:12:02.836 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/storage/nas/status with content: {"data":{"type":"cifs","total":2913889878016,"shared":1,"used":1963885137920,"active":1,"content":"backup,rootdir","avail":950004740096,"enabled":1}}
                          2023-12-16 11:12:02.837 - debug: proxmox.0 (211668) new storage: nas - {"type":"cifs","total":2913889878016,"shared":1,"used":1963885137920,"active":1,"content":"backup,rootdir","avail":950004740096,"enabled":1}
                          2023-12-16 11:12:02.837 - debug: proxmox.0 (211668) found states: [["proxmox.0.storage.nas","type","text","cifs"],["proxmox.0.storage.nas","total","size",2778902],["proxmox.0.storage.nas","shared","default_num",1],["proxmox.0.storage.nas","used_lev","level",67.4],["proxmox.0.storage.nas","used","size",1872907],["proxmox.0.storage.nas","active","default_num",1],["proxmox.0.storage.nas","content","text","backup,rootdir"],["proxmox.0.storage.nas","avail","size",905995],["proxmox.0.storage.nas","enabled","default_num",1]]
                          2023-12-16 11:12:03.381 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/storage/local/status with content: {"data":{"type":"dir","total":68959993856,"shared":0,"used":16431411200,"active":1,"content":"iso,backup,vztmpl","avail":48978735104,"enabled":1}}
                          2023-12-16 11:12:03.381 - debug: proxmox.0 (211668) new storage: local - {"type":"dir","total":68959993856,"shared":0,"used":16431411200,"active":1,"content":"iso,backup,vztmpl","avail":48978735104,"enabled":1}
                          2023-12-16 11:12:03.382 - debug: proxmox.0 (211668) found states: [["proxmox.0.storage.pve_local","type","text","dir"],["proxmox.0.storage.pve_local","total","size",65765],["proxmox.0.storage.pve_local","shared","default_num",0],["proxmox.0.storage.pve_local","used_lev","level",23.83],["proxmox.0.storage.pve_local","used","size",15670],["proxmox.0.storage.pve_local","active","default_num",1],["proxmox.0.storage.pve_local","content","text","iso,backup,vztmpl"],["proxmox.0.storage.pve_local","avail","size",46710],["proxmox.0.storage.pve_local","enabled","default_num",1]]
                          2023-12-16 11:12:26.074 - info: tuya.0 (205300) bf20c93cd48d53b9335f4u: Error on Reconnect (3): connection timed out
                          2023-12-16 11:12:33.394 - debug: proxmox.0 (211668) sendRequest interval started
                          2023-12-16 11:12:33.410 - debug: proxmox.0 (211668) received 200 response from /nodes with content: {"data":[{"level":"","type":"node","maxcpu":4,"node":"pve","disk":16431415296,"maxmem":16325120000,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","cpu":0.0357227911407478,"mem":4036329472,"maxdisk":68959993856,"uptime":228638,"id":"node/pve","status":"online"}]}
                          2023-12-16 11:12:33.410 - debug: proxmox.0 (211668) Nodes: [{"level":"","type":"node","maxcpu":4,"node":"pve","disk":16431415296,"maxmem":16325120000,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","cpu":0.0357227911407478,"mem":4036329472,"maxdisk":68959993856,"uptime":228638,"id":"node/pve","status":"online"}]
                          2023-12-16 11:12:33.411 - debug: proxmox.0 (211668) Node: {"level":"","type":"node","maxcpu":4,"node":"pve","disk":16431415296,"maxmem":16325120000,"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","cpu":0.0357227911407478,"mem":4036329472,"maxdisk":68959993856,"uptime":228638,"id":"node/pve","status":"online"}
                          2023-12-16 11:12:33.459 - debug: proxmox.0 (211668) Requesting states for node pve
                          2023-12-16 11:12:33.474 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/status with content: {"data":{"swap":{"free":7885025280,"total":7885287424,"used":262144},"boot-info":{"mode":"efi","secureboot":0},"current-kernel":{"version":"#1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z)","release":"6.5.11-7-pve","machine":"x86_64","sysname":"Linux"},"loadavg":["0.14","0.13","0.15"],"memory":{"used":4042465280,"free":12282654720,"total":16325120000},"rootfs":{"used":16431415296,"avail":48978731008,"total":68959993856,"free":52528578560},"wait":0.00120415982484948,"idle":0,"uptime":228643,"ksm":{"shared":0},"cpuinfo":{"cpus":4,"cores":4,"user_hz":100,"hvm":"1","sockets":1,"model":"Intel(R) Pentium(R) Silver J5040 CPU @ 2.00GHz","flags":"fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave rdrand lahf_lm 3dnowprefetch cpuid_fault cat_l2 cdp_l2 ssbd ibrs ibpb stibp ibrs_enhanced tpr_shadow flexpriority ept vpid ept_ad fsgsbase tsc_adjust sgx smep erms mpx rdt_a rdseed smap clflushopt intel_pt sha_ni xsaveopt xsavec xgetbv1 xsaves dtherm ida arat pln pts vnmi umip rdpid sgx_lc md_clear arch_capabilities","mhz":"2995.209"},"kversion":"Linux 6.5.11-7-pve #1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z)","cpu":0.0610837438423645,"pveversion":"pve-manager/8.1.3/b46aac3b42da5d15"}}
                          2023-12-16 11:12:33.735 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/disks/list with content: {"data":[{"vendor":"ATA ","wwn":"unknown","osdid":-1,"wearout":100,"by_id_link":"/dev/disk/by-id/ata-Patriot_Burst_Elite_240GB_PBEIICB22122105143","size":240057409536,"gpt":1,"used":"BIOS boot","devpath":"/dev/sda","health":"PASSED","type":"ssd","serial":"PBEIICB22122105143","model":"Patriot_Burst_Elite_240GB","osdid-list":null,"rpm":0}]}
                          2023-12-16 11:12:33.795 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/disks/smart?disk=/dev/sda with content: {"data":{"type":"ata","health":"PASSED","attributes":[{"raw":"0","fail":"-","id":" 1","threshold":50,"flags":"-O--CK","name":"Raw_Read_Error_Rate","normalized":100,"worst":100,"value":100},{"raw":"0","fail":"-","id":" 5","threshold":50,"flags":"-O--CK","name":"Reallocated_Sector_Ct","normalized":100,"worst":100,"value":100},{"fail":"-","raw":"5825","flags":"-O--CK","threshold":50,"id":" 9","name":"Power_On_Hours","worst":100,"normalized":100,"value":100},{"fail":"-","raw":"41","worst":100,"normalized":100,"value":100,"flags":"-O--CK","threshold":50,"id":" 12","name":"Power_Cycle_Count"},{"flags":"-O--CK","threshold":50,"id":"160","name":"Unknown_Attribute","worst":100,"normalized":100,"value":100,"fail":"-","raw":"29"},{"value":100,"worst":100,"normalized":100,"name":"Unknown_Attribute","flags":"-O--CK","id":"161","threshold":50,"fail":"-","raw":"100"},{"fail":"-","raw":"120","worst":100,"normalized":100,"value":100,"flags":"-O--CK","threshold":50,"id":"163","name":"Unknown_Attribute"},{"normalized":100,"worst":100,"value":100,"threshold":50,"id":"164","flags":"-O--CK","name":"Unknown_Attribute","raw":"14","fail":"-"},{"fail":"-","raw":"29","worst":100,"normalized":100,"value":100,"flags":"-O--CK","id":"165","threshold":50,"name":"Unknown_Attribute"},{"raw":"1","fail":"-","value":100,"normalized":100,"worst":100,"name":"Unknown_Attribute","threshold":50,"id":"166","flags":"-O--CK"},{"flags":"-O--CK","id":"167","threshold":50,"name":"Unknown_Attribute","worst":100,"normalized":100,"value":100,"fail":"-","raw":"8"},{"flags":"-O--CK","id":"168","threshold":50,"name":"Unknown_Attribute","worst":100,"normalized":100,"value":100,"fail":"-","raw":"0"},{"name":"Unknown_Attribute","id":"169","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100,"raw":"100","fail":"-"},{"raw":"0","fail":"-","name":"Program_Fail_Count_Chip","id":"175","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100},{"normalized":100,"worst":100,"value":100,"id":"176","threshold":50,"flags":"-O--CK","name":"Erase_Fail_Count_Chip","raw":"8678","fail":"-"},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","id":"177","threshold":50,"name":"Wear_Leveling_Count","fail":"-","raw":"574561"},{"value":100,"normalized":100,"worst":100,"name":"Used_Rsvd_Blk_Cnt_Chip","id":"178","threshold":50,"flags":"-O--CK","raw":"0","fail":"-"},{"fail":"-","raw":"0","name":"Program_Fail_Cnt_Total","flags":"-O--CK","threshold":50,"id":"181","value":100,"worst":100,"normalized":100},{"raw":"0","fail":"-","threshold":50,"id":"182","flags":"-O--CK","name":"Erase_Fail_Count_Total","normalized":100,"worst":100,"value":100},{"value":100,"worst":100,"normalized":100,"name":"Power-Off_Retract_Count","flags":"-O--CK","id":"192","threshold":50,"fail":"-","raw":"22"},{"name":"Temperature_Celsius","flags":"-O--CK","threshold":50,"id":"194","value":100,"worst":100,"normalized":100,"fail":"-","raw":"45"},{"fail":"-","raw":"155","worst":100,"normalized":100,"value":100,"flags":"-O--CK","threshold":50,"id":"195","name":"Hardware_ECC_Recovered"},{"raw":"0","fail":"-","id":"196","threshold":50,"flags":"-O--CK","name":"Reallocated_Event_Count","normalized":100,"worst":100,"value":100},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","id":"197","threshold":50,"name":"Current_Pending_Sector","fail":"-","raw":"0"},{"flags":"-O--CK","id":"198","threshold":50,"name":"O
ffline_Uncorrectable","worst":100,"normalized":100,"value":100,"fail":"-","raw":"0"},{"raw":"0","fail":"-","value":100,"normalized":100,"worst":100,"name":"UDMA_CRC_Error_Count","threshold":50,"id":"199","flags":"-O--CK"},{"raw":"100","fail":"-","normalized":100,"worst":100,"value":100,"id":"232","threshold":50,"flags":"-O--CK","name":"Available_Reservd_Space"},{"name":"Total_LBAs_Written","id":"241","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100,"raw":"20562","fail":"-"},{"name":"Total_LBAs_Read","flags":"-O--CK","id":"242","threshold":50,"value":100,"worst":100,"normalized":100,"fail":"-","raw":"82971"},{"raw":"56715","fail":"-","normalized":100,"worst":100,"value":100,"id":"245","threshold":50,"flags":"-O--CK","name":"Unknown_Attribute"}]}}
                          2023-12-16 11:12:33.807 - debug: proxmox.0 (211668) received 200 response from /cluster/ha/status/current with content: {"data":[{"id":"quorum","quorate":1,"node":"pve","status":"OK","type":"quorum"}]}
                          2023-12-16 11:12:33.826 - debug: proxmox.0 (211668) received 200 response from /cluster/resources with content: {"data":[{"status":"running","name":"Shinobi","diskread":1490972672,"id":"lxc/100","uptime":141127,"mem":364953600,"maxdisk":16729894912,"netout":43121803725,"cpu":0.0162175755259984,"vmid":100,"maxmem":4294967296,"disk":2378874880,"node":"pve","maxcpu":2,"type":"lxc","template":0,"netin":52933134374,"diskwrite":262647808},{"status":"running","name":"DebianIO","diskread":2164809728,"id":"lxc/104","maxdisk":16729894912,"mem":2362511360,"uptime":228617,"vmid":104,"netout":1022393755,"cpu":0.0176851168696087,"disk":4722978816,"maxmem":8589934592,"node":"pve","maxcpu":4,"type":"lxc","template":0,"netin":1102871058,"diskwrite":11620073472},{"disk":16431415296,"maxmem":16325120000,"node":"pve","maxcpu":4,"type":"node","level":"","status":"online","id":"node/pve","mem":4036329472,"maxdisk":68959993856,"uptime":228638,"cpu":0.0357227911407478,"cgroup-mode":2},{"storage":"local","type":"storage","content":"iso,backup,vztmpl","id":"storage/pve/local","maxdisk":68959993856,"plugintype":"dir","disk":16431415296,"status":"available","node":"pve","shared":0},{"node":"pve","status":"available","shared":1,"plugintype":"cifs","disk":1963896541184,"maxdisk":2913889878016,"id":"storage/pve/nas","type":"storage","content":"backup,rootdir","storage":"nas"},{"type":"storage","content":"rootdir,images","storage":"local-lvm","plugintype":"lvmthin","disk":10234252335,"status":"available","node":"pve","shared":0,"id":"storage/pve/local-lvm","maxdisk":140387549184},{"sdn":"localnetwork","node":"pve","status":"ok","type":"sdn","id":"sdn/pve/localnetwork"}]}
                          2023-12-16 11:12:33.867 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/lxc/100/status/current with content: {"data":{"name":"Shinobi","swap":0,"ha":{"managed":0},"netin":52935495172,"pid":488202,"maxswap":536870912,"netout":43124360811,"status":"running","mem":364802048,"maxmem":4294967296,"cpus":2,"disk":2378874880,"vmid":100,"diskread":1490972672,"uptime":141132,"cpu":0.0235437722515437,"type":"lxc","maxdisk":16729894912,"diskwrite":262647808}}
                          2023-12-16 11:12:33.868 - debug: proxmox.0 (211668) found states: [["proxmox.0.lxc.Shinobi","netin","sizeb",52935495172],["proxmox.0.lxc.Shinobi","pid","default_num",488202],["proxmox.0.lxc.Shinobi","netout","sizeb",43124360811],["proxmox.0.lxc.Shinobi","status","text","running"],["proxmox.0.lxc.Shinobi","mem_lev","level",8.49],["proxmox.0.lxc.Shinobi","mem","size",348],["proxmox.0.lxc.Shinobi","maxmem","size",4096],["proxmox.0.lxc.Shinobi","cpus","default_num",2],["proxmox.0.lxc.Shinobi","disk_lev","level",14.22],["proxmox.0.lxc.Shinobi","disk","size",2269],["proxmox.0.lxc.Shinobi","vmid","default_num",100],["proxmox.0.lxc.Shinobi","uptime","time",141132],["proxmox.0.lxc.Shinobi","cpu","level",2.35],["proxmox.0.lxc.Shinobi","type","text","lxc"],["proxmox.0.lxc.Shinobi","maxdisk","size",15955],["proxmox.0.lxc.Shinobi","diskwrite","size",250]]
                          2023-12-16 11:12:34.011 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/lxc/104/status/current with content: {"data":{"diskread":2164809728,"uptime":228622,"maxmem":8589934592,"cpus":4,"disk":4722987008,"vmid":104,"diskwrite":11620073472,"cpu":0.0311112762093258,"type":"lxc","maxdisk":16729894912,"pid":948,"netin":1102918186,"name":"DebianIO","swap":20480,"ha":{"managed":0},"netout":1022429176,"status":"running","mem":2365046784,"maxswap":1073741824}}
                          2023-12-16 11:12:34.012 - debug: proxmox.0 (211668) found states: [["proxmox.0.lxc.DebianIO","uptime","time",228622],["proxmox.0.lxc.DebianIO","maxmem","size",8192],["proxmox.0.lxc.DebianIO","cpus","default_num",4],["proxmox.0.lxc.DebianIO","disk_lev","level",28.23],["proxmox.0.lxc.DebianIO","disk","size",4504],["proxmox.0.lxc.DebianIO","vmid","default_num",104],["proxmox.0.lxc.DebianIO","diskwrite","size",11082],["proxmox.0.lxc.DebianIO","cpu","level",3.11],["proxmox.0.lxc.DebianIO","type","text","lxc"],["proxmox.0.lxc.DebianIO","maxdisk","size",15955],["proxmox.0.lxc.DebianIO","pid","default_num",948],["proxmox.0.lxc.DebianIO","netin","sizeb",1102918186],["proxmox.0.lxc.DebianIO","netout","sizeb",1022429176],["proxmox.0.lxc.DebianIO","status","text","running"],["proxmox.0.lxc.DebianIO","mem_lev","level",27.53],["proxmox.0.lxc.DebianIO","mem","size",2255]]
                          2023-12-16 11:12:34.652 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/storage/local/status with content: {"data":{"enabled":1,"avail":48978731008,"content":"backup,vztmpl,iso","used":16431415296,"active":1,"total":68959993856,"shared":0,"type":"dir"}}
                          2023-12-16 11:12:34.652 - debug: proxmox.0 (211668) found states: [["proxmox.0.storage.pve_local","enabled","default_num",1],["proxmox.0.storage.pve_local","avail","size",46710],["proxmox.0.storage.pve_local","content","text","backup,vztmpl,iso"],["proxmox.0.storage.pve_local","used_lev","level",23.83],["proxmox.0.storage.pve_local","used","size",15670],["proxmox.0.storage.pve_local","active","default_num",1],["proxmox.0.storage.pve_local","total","size",65765],["proxmox.0.storage.pve_local","shared","default_num",0],["proxmox.0.storage.pve_local","type","text","dir"]]
                          2023-12-16 11:12:35.176 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/storage/nas/status with content: {"data":{"active":1,"used":1963898900480,"type":"cifs","total":2913889878016,"shared":1,"content":"rootdir,backup","avail":949990977536,"enabled":1}}
                          2023-12-16 11:12:35.177 - debug: proxmox.0 (211668) found states: [["proxmox.0.storage.nas","active","default_num",1],["proxmox.0.storage.nas","used_lev","level",67.4],["proxmox.0.storage.nas","used","size",1872920],["proxmox.0.storage.nas","type","text","cifs"],["proxmox.0.storage.nas","total","size",2778902],["proxmox.0.storage.nas","shared","default_num",1],["proxmox.0.storage.nas","content","text","rootdir,backup"],["proxmox.0.storage.nas","avail","size",905982],["proxmox.0.storage.nas","enabled","default_num",1]]
                          2023-12-16 11:12:35.692 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/storage/local-lvm/status with content: {"data":{"used":10234252335,"active":1,"type":"lvmthin","total":140387549184,"shared":0,"content":"rootdir,images","enabled":1,"avail":130153296849}}
                          2023-12-16 11:12:35.693 - debug: proxmox.0 (211668) found states: [["proxmox.0.storage.pve_local-lvm","used_lev","level",7.29],["proxmox.0.storage.pve_local-lvm","used","size",9760],["proxmox.0.storage.pve_local-lvm","active","default_num",1],["proxmox.0.storage.pve_local-lvm","type","text","lvmthin"],["proxmox.0.storage.pve_local-lvm","total","size",133884],["proxmox.0.storage.pve_local-lvm","shared","default_num",0],["proxmox.0.storage.pve_local-lvm","content","text","rootdir,images"],["proxmox.0.storage.pve_local-lvm","enabled","default_num",1],["proxmox.0.storage.pve_local-lvm","avail","size",124124]]
                          2023-12-16 11:13:05.700 - debug: proxmox.0 (211668) sendRequest interval started
                          2023-12-16 11:13:05.713 - debug: proxmox.0 (211668) received 200 response from /nodes with content: {"data":[{"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","cpu":0.0367016205910391,"mem":4048191488,"maxdisk":68959993856,"uptime":228668,"id":"node/pve","status":"online","level":"","type":"node","maxcpu":4,"node":"pve","maxmem":16325120000,"disk":16431415296}]}
                          2023-12-16 11:13:05.714 - debug: proxmox.0 (211668) Nodes: [{"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","cpu":0.0367016205910391,"mem":4048191488,"maxdisk":68959993856,"uptime":228668,"id":"node/pve","status":"online","level":"","type":"node","maxcpu":4,"node":"pve","maxmem":16325120000,"disk":16431415296}]
                          2023-12-16 11:13:05.714 - debug: proxmox.0 (211668) Node: {"ssl_fingerprint":"3C:6C:59:66:6F:8D:C5:C5:D4:9D:8D:9F:DD:90:AF:CF:08:DF:3D:15:2E:CC:B1:E8:DD:97:39:B4:9A:1D:36:86","cpu":0.0367016205910391,"mem":4048191488,"maxdisk":68959993856,"uptime":228668,"id":"node/pve","status":"online","level":"","type":"node","maxcpu":4,"node":"pve","maxmem":16325120000,"disk":16431415296}
                          2023-12-16 11:13:05.763 - debug: proxmox.0 (211668) Requesting states for node pve
                          2023-12-16 11:13:05.776 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/status with content: {"data":{"loadavg":["0.09","0.11","0.14"],"memory":{"free":12279840768,"total":16325120000,"used":4045279232},"current-kernel":{"sysname":"Linux","machine":"x86_64","release":"6.5.11-7-pve","version":"#1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z)"},"boot-info":{"mode":"efi","secureboot":0},"swap":{"total":7885287424,"free":7885025280,"used":262144},"pveversion":"pve-manager/8.1.3/b46aac3b42da5d15","cpu":0.0492085206175493,"kversion":"Linux 6.5.11-7-pve #1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z)","ksm":{"shared":0},"cpuinfo":{"hvm":"1","sockets":1,"model":"Intel(R) Pentium(R) Silver J5040 CPU @ 2.00GHz","cpus":4,"cores":4,"user_hz":100,"flags":"fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave rdrand lahf_lm 3dnowprefetch cpuid_fault cat_l2 cdp_l2 ssbd ibrs ibpb stibp ibrs_enhanced tpr_shadow flexpriority ept vpid ept_ad fsgsbase tsc_adjust sgx smep erms mpx rdt_a rdseed smap clflushopt intel_pt sha_ni xsaveopt xsavec xgetbv1 xsaves dtherm ida arat pln pts vnmi umip rdpid sgx_lc md_clear arch_capabilities","mhz":"2995.209"},"idle":0,"uptime":228675,"wait":0.00121164744967755,"rootfs":{"avail":48978731008,"used":16431415296,"free":52528578560,"total":68959993856}}}
                          2023-12-16 11:13:06.053 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/disks/list with content: {"data":[{"by_id_link":"/dev/disk/by-id/ata-Patriot_Burst_Elite_240GB_PBEIICB22122105143","wearout":100,"size":240057409536,"gpt":1,"vendor":"ATA ","wwn":"unknown","osdid":-1,"osdid-list":null,"rpm":0,"used":"BIOS boot","devpath":"/dev/sda","health":"PASSED","type":"ssd","serial":"PBEIICB22122105143","model":"Patriot_Burst_Elite_240GB"}]}
                          2023-12-16 11:13:06.115 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/disks/smart?disk=/dev/sda with content: {"data":{"type":"ata","attributes":[{"fail":"-","raw":"0","flags":"-O--CK","threshold":50,"id":" 1","name":"Raw_Read_Error_Rate","worst":100,"normalized":100,"value":100},{"normalized":100,"worst":100,"value":100,"id":" 5","threshold":50,"flags":"-O--CK","name":"Reallocated_Sector_Ct","raw":"0","fail":"-"},{"normalized":100,"worst":100,"value":100,"id":" 9","threshold":50,"flags":"-O--CK","name":"Power_On_Hours","raw":"5825","fail":"-"},{"raw":"41","fail":"-","threshold":50,"id":" 12","flags":"-O--CK","name":"Power_Cycle_Count","normalized":100,"worst":100,"value":100},{"fail":"-","raw":"29","worst":100,"normalized":100,"value":100,"flags":"-O--CK","id":"160","threshold":50,"name":"Unknown_Attribute"},{"raw":"100","fail":"-","value":100,"normalized":100,"worst":100,"name":"Unknown_Attribute","id":"161","threshold":50,"flags":"-O--CK"},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","threshold":50,"id":"163","name":"Unknown_Attribute","fail":"-","raw":"120"},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","threshold":50,"id":"164","name":"Unknown_Attribute","fail":"-","raw":"14"},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","id":"165","threshold":50,"name":"Unknown_Attribute","fail":"-","raw":"29"},{"raw":"1","fail":"-","name":"Unknown_Attribute","id":"166","threshold":50,"flags":"-O--CK","value":100,"normalized":100,"worst":100},{"raw":"8","fail":"-","normalized":100,"worst":100,"value":100,"id":"167","threshold":50,"flags":"-O--CK","name":"Unknown_Attribute"},{"raw":"0","fail":"-","normalized":100,"worst":100,"value":100,"threshold":50,"id":"168","flags":"-O--CK","name":"Unknown_Attribute"},{"raw":"100","fail":"-","threshold":50,"id":"169","flags":"-O--CK","name":"Unknown_Attribute","normalized":100,"worst":100,"value":100},{"value":100,"worst":100,"normalized":100,"name":"Program_Fail_Count_Chip","flags":"-O--CK","threshold":50,"id":"175","fail":"-","raw":"0"},{"raw":"8678","fail":"-","id":"176","threshold":50,"flags":"-O--CK","name":"Erase_Fail_Count_Chip","normalized":100,"worst":100,"value":100},{"normalized":100,"worst":100,"value":100,"threshold":50,"id":"177","flags":"-O--CK","name":"Wear_Leveling_Count","raw":"574561","fail":"-"},{"raw":"0","fail":"-","value":100,"normalized":100,"worst":100,"name":"Used_Rsvd_Blk_Cnt_Chip","id":"178","threshold":50,"flags":"-O--CK"},{"threshold":50,"id":"181","flags":"-O--CK","name":"Program_Fail_Cnt_Total","normalized":100,"worst":100,"value":100,"raw":"0","fail":"-"},{"worst":100,"normalized":100,"value":100,"flags":"-O--CK","threshold":50,"id":"182","name":"Erase_Fail_Count_Total","fail":"-","raw":"0"},{"raw":"22","fail":"-","name":"Power-Off_Retract_Count","threshold":50,"id":"192","flags":"-O--CK","value":100,"normalized":100,"worst":100},{"raw":"45","fail":"-","normalized":100,"worst":100,"value":100,"id":"194","threshold":50,"flags":"-O--CK","name":"Temperature_Celsius"},{"fail":"-","raw":"155","name":"Hardware_ECC_Recovered","flags":"-O--CK","id":"195","threshold":50,"value":100,"worst":100,"normalized":100},{"raw":"0","fail":"-","normalized":100,"worst":100,"value":100,"id":"196","threshold":50,"flags":"-O--CK","name":"Reallocated_Event_Count"},{"raw":"0","fail":"-","name":"Current_Pending_Sector","threshold":50,"id":"197","flags":"-O--CK","value":100,"normalized":100,"worst":100},{"fail":"-","raw":"0","value":100,"worst":100,"normalized":100,"name":"
Offline_Uncorrectable","flags":"-O--CK","threshold":50,"id":"198"},{"raw":"0","fail":"-","value":100,"normalized":100,"worst":100,"name":"UDMA_CRC_Error_Count","threshold":50,"id":"199","flags":"-O--CK"},{"name":"Available_Reservd_Space","threshold":50,"id":"232","flags":"-O--CK","value":100,"normalized":100,"worst":100,"raw":"100","fail":"-"},{"name":"Total_LBAs_Written","flags":"-O--CK","threshold":50,"id":"241","value":100,"worst":100,"normalized":100,"fail":"-","raw":"20562"},{"raw":"82971","fail":"-","value":100,"normalized":100,"worst":100,"name":"Total_LBAs_Read","id":"242","threshold":50,"flags":"-O--CK"},{"fail":"-","raw":"56715","name":"Unknown_Attribute","flags":"-O--CK","id":"245","threshold":50,"value":100,"worst":100,"normalized":100}],"health":"PASSED"}}
                          2023-12-16 11:13:06.127 - debug: proxmox.0 (211668) received 200 response from /cluster/ha/status/current with content: {"data":[{"id":"quorum","type":"quorum","quorate":1,"status":"OK","node":"pve"}]}
                          2023-12-16 11:13:06.143 - debug: proxmox.0 (211668) received 200 response from /cluster/resources with content: {"data":[{"id":"lxc/100","diskread":1490972672,"name":"Shinobi","mem":365195264,"maxdisk":16729894912,"uptime":141157,"status":"running","vmid":100,"netout":43135101442,"cpu":0.0195003341998477,"maxcpu":2,"maxmem":4294967296,"disk":2378874880,"node":"pve","netin":52946970371,"diskwrite":262647808,"type":"lxc","template":0},{"maxcpu":4,"disk":4722991104,"maxmem":8589934592,"node":"pve","netin":1103036850,"diskwrite":11620073472,"type":"lxc","template":0,"name":"DebianIO","diskread":2164809728,"id":"lxc/104","maxdisk":16729894912,"mem":2364608512,"uptime":228647,"status":"running","vmid":104,"netout":1022503969,"cpu":0.0180328345598081},{"type":"node","level":"","disk":16431415296,"maxmem":16325120000,"node":"pve","maxcpu":4,"cpu":0.0367016205910391,"cgroup-mode":2,"status":"online","id":"node/pve","mem":4048191488,"maxdisk":68959993856,"uptime":228668},{"storage":"local","type":"storage","content":"iso,backup,vztmpl","id":"storage/pve/local","maxdisk":68959993856,"plugintype":"dir","disk":16431415296,"status":"available","node":"pve","shared":0},{"storage":"nas","content":"rootdir,backup","type":"storage","maxdisk":2913889878016,"id":"storage/pve/nas","shared":1,"node":"pve","status":"available","disk":1963908993024,"plugintype":"cifs"},{"type":"storage","content":"images,rootdir","storage":"local-lvm","node":"pve","status":"available","shared":0,"plugintype":"lvmthin","disk":10234252335,"maxdisk":140387549184,"id":"storage/pve/local-lvm"},{"id":"sdn/pve/localnetwork","sdn":"localnetwork","node":"pve","status":"ok","type":"sdn"}]}
                          2023-12-16 11:13:06.184 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/lxc/100/status/current with content: {"data":{"maxswap":536870912,"mem":365805568,"status":"running","netout":43138068665,"ha":{"managed":0},"swap":0,"name":"Shinobi","pid":488202,"netin":52950452648,"maxdisk":16729894912,"type":"lxc","cpu":0.0241362704058496,"diskwrite":262647808,"vmid":100,"disk":2378874880,"cpus":2,"maxmem":4294967296,"uptime":141164,"diskread":1490972672}}
                          2023-12-16 11:13:06.184 - debug: proxmox.0 (211668) found states: [["proxmox.0.lxc.Shinobi","mem_lev","level",8.52],["proxmox.0.lxc.Shinobi","mem","size",349],["proxmox.0.lxc.Shinobi","status","text","running"],["proxmox.0.lxc.Shinobi","netout","sizeb",43138068665],["proxmox.0.lxc.Shinobi","pid","default_num",488202],["proxmox.0.lxc.Shinobi","netin","sizeb",52950452648],["proxmox.0.lxc.Shinobi","maxdisk","size",15955],["proxmox.0.lxc.Shinobi","type","text","lxc"],["proxmox.0.lxc.Shinobi","cpu","level",2.41],["proxmox.0.lxc.Shinobi","diskwrite","size",250],["proxmox.0.lxc.Shinobi","vmid","default_num",100],["proxmox.0.lxc.Shinobi","disk_lev","level",14.22],["proxmox.0.lxc.Shinobi","disk","size",2269],["proxmox.0.lxc.Shinobi","cpus","default_num",2],["proxmox.0.lxc.Shinobi","maxmem","size",4096],["proxmox.0.lxc.Shinobi","uptime","time",141164]]
                          2023-12-16 11:13:06.329 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/lxc/104/status/current with content: {"data":{"pid":948,"netin":1103088554,"ha":{"managed":0},"swap":20480,"name":"DebianIO","mem":2367852544,"status":"running","netout":1022541614,"maxswap":1073741824,"uptime":228654,"diskread":2164809728,"vmid":104,"disk":4723093504,"cpus":4,"maxmem":8589934592,"diskwrite":11620466688,"maxdisk":16729894912,"type":"lxc","cpu":0.0212253671186049}}
                          2023-12-16 11:13:06.329 - debug: proxmox.0 (211668) found states: [["proxmox.0.lxc.DebianIO","pid","default_num",948],["proxmox.0.lxc.DebianIO","netin","sizeb",1103088554],["proxmox.0.lxc.DebianIO","mem_lev","level",27.57],["proxmox.0.lxc.DebianIO","mem","size",2258],["proxmox.0.lxc.DebianIO","status","text","running"],["proxmox.0.lxc.DebianIO","netout","sizeb",1022541614],["proxmox.0.lxc.DebianIO","uptime","time",228654],["proxmox.0.lxc.DebianIO","vmid","default_num",104],["proxmox.0.lxc.DebianIO","disk_lev","level",28.23],["proxmox.0.lxc.DebianIO","disk","size",4504],["proxmox.0.lxc.DebianIO","cpus","default_num",4],["proxmox.0.lxc.DebianIO","maxmem","size",8192],["proxmox.0.lxc.DebianIO","diskwrite","size",11082],["proxmox.0.lxc.DebianIO","maxdisk","size",15955],["proxmox.0.lxc.DebianIO","type","text","lxc"],["proxmox.0.lxc.DebianIO","cpu","level",2.12]]
                          2023-12-16 11:13:07.093 - debug: proxmox.0 (211668) received 200 response from /nodes/pve/storage/local/status with content: {"data":{"avail":48978731008,"enabled":1,"content":"iso,backup,vztmpl","used":16431415296,"active":1,"total":68959993856,"shared":0,"type":"dir"}}
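For reference, the "found states" values in the log above are simply the raw byte counts from the API converted to MiB and percentages. A small sketch that reproduces two of the nas storage values (derived only from the numbers in the log, not from the adapter source):

    # 2913889878016 B -> 2778902 MiB total; 1963885137920 / 2913889878016 -> 67.4 % used
    awk 'BEGIN { total = 2913889878016; used = 1963885137920;
                 printf "%.0f MiB total, %.1f %% used\n", total / 1048576, used / total * 100 }'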
                          
• A former user

@david-g said in [Neuer Adapter] Proxmox VM:

Wasn't that roughly the same time of day as when the adapter last stopped?

Morning,

according to your big log from yesterday it was 23:12. When do you run a backup? Or, put differently, what is running on your machine at that time?
Could you take a look at the logs of the machine itself, i.e. not ioBroker but Linux, either

                            # sudo journalctl -g proxmox
                            # sudo journalctl -g error
                            

Best regards
Bernd
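A variant of those commands, bounded to the night in question, can narrow the search; the timestamps below are only an example based on the 23:12 mentioned above:

    # same grep, limited to the window in which the adapter stopped updating
    sudo journalctl -g proxmox --since "2023-12-15 23:00" --until "2023-12-16 00:00"
    sudo journalctl -g error   --since "2023-12-15 23:00" --until "2023-12-16 00:00"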

David G.
wrote, last edited by David G.
#331

                            @dp20eic

The backup is already finished by then.
It runs at 21:00 and is done after about 20 minutes.

Off the top of my head I can't think of anything special running at that time...

                            Last login: Thu Dec 14 22:28:11 2023 from 192.168.99.50
                            iobroker@iobroker:~$ sudo journalctl -g proxmox
                            [sudo] Passwort für iobroker:
                            -- Boot 7a1d7974a0ef4f31b90e22f35fe68af1 --
                            -- Boot e12b69c1eb2a41b6bf9bc268a74fd7f9 --
                            -- Boot 790e9a309e8a4ec9861d865498851593 --
                            -- Boot 487b12abffc44c2385c5afc18c2e9953 --
                            -- Boot 019714ebe44e4d068036a535263185da --
                            -- Boot 1752ccc778234a85894f9b57771e1fac --
                            -- Boot 4d351f0ba3924bd3a6ce9c46c25edcce --
                            -- Boot e1b0b90a1093466ab6a28859b40b2fc1 --
                            -- Boot f7258401e6d2425097cd24b02a45f923 --
                            -- Boot a0047fa38dd44178959c23b3f8a2934f --
                            -- Boot 07b7a6a260214c36b4477ee4c6aaeabc --
                            -- Boot aa24d439cfb5480ba0ffd3d0e803b008 --
                            -- Boot a486845c190b4a4f9ad87da03cc1a4d9 --
                            -- Boot e873ce4aff624c688e542d606a8df2a1 --
                            -- Boot 240c1cb9a46c43c1a8a1dc13145aad46 --
                            -- Boot 29b4468fd77646e1a16192f713cbb093 --
                            Sep 30 22:48:35 iobroker bash[509]: host.iobroker check insta>Sep 30 22:48:38 iobroker bash[509]: Send diag info: {"uuid":">Sep 30 22:53:09 iobroker bash[509]: Send diag info: {"uuid":">Okt 01 07:41:08 iobroker bash[509]: Send diag info: {"uuid":">-- Boot d30c9f2eeabf41bc9f99ec98319d5c7d --
                            Okt 01 08:59:05 iobroker bash[526]: host.iobroker check insta>Okt 01 08:59:07 iobroker bash[526]: Send diag info: {"uuid":">lines 1-23
                            
                            Last login: Sat Dec 16 11:25:37 2023 from 192.168.99.50
                            iobroker@iobroker:~$ sudo journalctl -g error
                            [sudo] Passwort für iobroker:
                            -- Boot 7a1d7974a0ef4f31b90e22f35fe68af1 --
                            -- Boot e12b69c1eb2a41b6bf9bc268a74fd7f9 --
                            -- Boot 790e9a309e8a4ec9861d865498851593 --
                            -- Boot 487b12abffc44c2385c5afc18c2e9953 --
                            Sep 29 22:02:47 iobroker kernel: usb 2-2: can't set config #1>-- Boot 019714ebe44e4d068036a535263185da --
                            -- Boot 1752ccc778234a85894f9b57771e1fac --
                            -- Boot 4d351f0ba3924bd3a6ce9c46c25edcce --
                            -- Boot e1b0b90a1093466ab6a28859b40b2fc1 --
                            -- Boot f7258401e6d2425097cd24b02a45f923 --
                            Sep 30 17:13:29 iobroker kernel: usb 2-1: can't set config #1>-- Boot a0047fa38dd44178959c23b3f8a2934f --
                            -- Boot 07b7a6a260214c36b4477ee4c6aaeabc --
                            -- Boot aa24d439cfb5480ba0ffd3d0e803b008 --
                            Sep 30 17:39:21 iobroker bash[524]: FATAL ERROR: Ineffective >-- Boot a486845c190b4a4f9ad87da03cc1a4d9 --
                            Sep 30 17:47:28 iobroker bash[528]: This error originated eit>Sep 30 17:47:28 iobroker bash[528]: ReferenceError: obj is no>Sep 30 17:47:35 iobroker bash[528]: This error originated eit>Sep 30 17:47:35 iobroker bash[528]: Error: ENOENT: no such fi>Sep 30 17:48:01 iobroker bash[528]: This error originated eit>Sep 30 17:48:01 iobroker bash[528]: Error: ENOENT: no such fi>Sep 30 17:51:43 iobroker bash[528]: This error originated eit>lines 1-23
                            

Attached is yesterday's log.
The last time a datapoint was updated was at 23:15:45.
iobroker.2023-12-15.zip
Sorry, because of its size only a zip is possible.
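The journal output above is cut off by the pager ("lines 1-23"); to get the complete output, one option is to disable the pager and write everything to a file, e.g. (file names are arbitrary):

    # dump the full, untruncated journal matches to files
    sudo journalctl -g proxmox --no-pager > journal-proxmox.txt
    sudo journalctl -g error   --no-pager > journal-error.txt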



arteck
Developer Most Active
wrote, last edited by arteck
#332

                              @david-g

                              2023-12-15 23:15:45.724  - debug: proxmox.0 (1014320)
                              

I can see that too.. hmm... I'll take a look today..



David G.
wrote, last edited
#333

                                @arteck
What is it trying to tell us with that?
Some value it cannot process?


• arteck
  Developer Most Active
  wrote, last edited by arteck
  #334

I am receiving data on all 4 instances...

both for lxc and for vm..
the adapter has been running since 10.12.2023
                                  e30dfde5-2cad-48fd-a8a0-62cae113851a-grafik.png

here is my lxc with checkmk
                                  75f07ea6-17f1-42e7-8fe8-b80eb9d3cdb3-grafik.png
                                  c6b16c6e-089f-410f-8825-5c77da4dc69e-grafik.png



dhd70106
wrote, last edited
#335

                                    @arteck
But something still doesn't seem right:
                                    IMG_0219.png IMG_0220.jpeg


arteck
Developer Most Active
wrote, last edited by arteck
#336

@dhd70106 Are you looking in the right directory?? Otherwise I have no idea.

Ah no, you have the flat structure... Do you only have one Proxmox machine??
Maybe that is the reason... that the adapter has a problem with that.

I have 5 NUCs here, and my ioBroker does not query the Proxmox host that ioBroker itself runs on but a different one..



David G.
wrote, last edited
#337

@arteck said in [Neuer Adapter] Proxmox VM:

I have 5 NUCs here, and my ioBroker does not query the Proxmox host that ioBroker itself runs on but a different one..

Should that actually make a difference?
The API shouldn't care who queries what from where, should it?



arteck
Developer Most Active
wrote, last edited
#338

@david-g Yes, it does not matter where the request comes from...

As I said, no idea.
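One way to check that claim directly is to query the Proxmox API from any other machine with curl; a minimal sketch, assuming API token authentication (host, user, token id and secret are placeholders, /nodes is the same endpoint the adapter polls):

    # -k skips the self-signed certificate check; replace the placeholders with real values
    curl -k -H "Authorization: PVEAPIToken=iobroker@pam!monitoring=SECRET-UUID" \
         "https://<pve-host>:8006/api2/json/nodes"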

