
    InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler

    • akuhnsh
      akuhnsh @Thomas Braun last edited by

      @thomas-braun

      The terminal is running under macOS, yes.

      • akuhnsh
        akuhnsh @Thomas Braun last edited by akuhnsh

        @thomas-braun

        @thomas-braun said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

        @akuhnsh

        Is that macOS?

        Sorry, I wasn't connected to the Pi over SSH... oops!

        ======== Start marking the full check here =========

        Skript v.2023-04-15
        
        *** BASE SYSTEM ***
        Model		: Raspberry Pi 4 Model B Rev 1.2
        Architecture    : aarch64
        Docker          : false
        Virtualization  : none
        Distributor ID:	Debian
        Description:	Debian GNU/Linux 11 (bullseye)
        Release:	11
        Codename:	bullseye
        
        PRETTY_NAME="Debian GNU/Linux 11 (bullseye)"
        NAME="Debian GNU/Linux"
        VERSION_ID="11"
        VERSION="11 (bullseye)"
        VERSION_CODENAME=bullseye
        ID=debian
        HOME_URL="https://www.debian.org/"
        SUPPORT_URL="https://www.debian.org/support"
        BUG_REPORT_URL="https://bugs.debian.org/"
        
        Systemuptime and Load:
         12:03:24 up 1 day,  1:06,  1 user,  load average: 0.88, 0.52, 0.25
        CPU threads: 4
        
        Raspberry only:
        throttled=0x0
        Other values than 0x0 hint to temperature/voltage problems
        temp=63.3'C
        volt=0.8438V
        
        *** Time and Time Zones ***
                       Local time: Sat 2023-05-06 12:03:24 CEST
                   Universal time: Sat 2023-05-06 10:03:24 UTC
                         RTC time: n/a
                        Time zone: Europe/Berlin (CEST, +0200)
        System clock synchronized: yes
                      NTP service: active
                  RTC in local TZ: no
        
        *** User and Groups ***
        pi
        /home/pi
        pi adm dialout cdrom sudo audio video plugdev games users input render netdev gpio i2c spi iobroker
        
        *** X-Server-Setup ***
        X-Server: 	false
        Desktop: 	
        Terminal: 	tty
        Boot Target: 	multi-user.target
        
        *** MEMORY ***
                       total        used        free      shared  buff/cache   available
        Mem:            3.8G        1.5G        714M        1.0M        1.6G        2.2G
        Swap:            99M          0B         99M
        Total:          3.9G        1.5G        814M
        
                 3794 M total memory
                 1481 M used memory
                 1738 M active memory
                 1185 M inactive memory
                  714 M free memory
                   83 M buffer memory
                 1515 M swap cache
                   99 M total swap
                    0 M used swap
                   99 M free swap
        
        Raspberry only:
        oom events: 0
        lifetime oom required: 0 Mbytes
        total time in oom handler: 0 ms
        max time spent in oom handler: 0 ms
        
        *** FILESYSTEM ***
        Filesystem     Type      Size  Used Avail Use% Mounted on
        /dev/root      ext4      117G   18G   96G  16% /
        devtmpfs       devtmpfs  1.7G     0  1.7G   0% /dev
        tmpfs          tmpfs     1.9G     0  1.9G   0% /dev/shm
        tmpfs          tmpfs     759M  1.1M  758M   1% /run
        tmpfs          tmpfs     5.0M  4.0K  5.0M   1% /run/lock
        /dev/mmcblk0p1 vfat      255M   32M  224M  13% /boot
        tmpfs          tmpfs     380M     0  380M   0% /run/user/1000
        
        Messages concerning ext4 filesystem in dmesg:
        [Fri May  5 10:57:02 2023] Kernel command line: coherent_pool=1M 8250.nr_uarts=0 snd_bcm2835.enable_headphones=0 snd_bcm2835.enable_headphones=1 snd_bcm2835.enable_hdmi=1 snd_bcm2835.enable_hdmi=0  smsc95xx.macaddr=DC:A6:32:99:1F:06 vc_mem.mem_base=0x3ec00000 vc_mem.mem_size=0x40000000  console=ttyS0,115200 console=tty1 root=PARTUUID=accb09c9-02 rootfstype=ext4 fsck.repair=yes rootwait
        [Fri May  5 10:57:03 2023] EXT4-fs (mmcblk0p2): mounted filesystem with ordered data mode. Quota mode: none.
        [Fri May  5 10:57:03 2023] VFS: Mounted root (ext4 filesystem) readonly on device 179:2.
        [Fri May  5 10:57:05 2023] EXT4-fs (mmcblk0p2): re-mounted. Quota mode: none.
        
        Show mounted filesystems (real ones only):
        TARGET  SOURCE         FSTYPE OPTIONS
        /       /dev/mmcblk0p2 ext4   rw,noatime
        `-/boot /dev/mmcblk0p1 vfat   rw,relatime,fmask=0022,dmask=0022,codepage=437,iocharset=ascii,shortname=mixed,
        
        Files in neuralgic directories:
        
        /var:
        7.2G	/var/
        4.2G	/var/log
        4.0G	/var/log/journal/16936caa76ee4e328d697dd513277359
        4.0G	/var/log/journal
        2.3G	/var/lib
        
        Archived and active journals take up 3.9G in the file system.
        
        /opt/iobroker/backups:
        7.0G	/opt/iobroker/backups/
        
        /opt/iobroker/iobroker-data:
        196M	/opt/iobroker/iobroker-data/
        153M	/opt/iobroker/iobroker-data/files
        51M	/opt/iobroker/iobroker-data/files/javascript.admin
        48M	/opt/iobroker/iobroker-data/files/telegram.admin
        32M	/opt/iobroker/iobroker-data/files/javascript.admin/static/js
        
        The five largest files in iobroker-data are:
        15M	/opt/iobroker/iobroker-data/files/telegram.admin/rules/static/js/vendors-node_modules_iobroker_adapter-react-v5_assets_devices_parseNames_d_ts-node_modules_io-1d9f06.44fe4a3f.chunk.js.map
        9.5M	/opt/iobroker/iobroker-data/objects.jsonl
        7.2M	/opt/iobroker/iobroker-data/files/telegram.admin/rules/static/js/vendors-node_modules_iobroker_adapter-react-v5_assets_devices_parseNames_d_ts-node_modules_io-1d9f06.44fe4a3f.chunk.js
        6.9M	/opt/iobroker/iobroker-data/files/telegram.admin/custom/static/js/vendors-node_modules_mui_icons-material_esm_index_js.8fdf8cb7.chunk.js.map
        6.9M	/opt/iobroker/iobroker-data/files/javascript.admin/static/js/610.de0231c9.chunk.js.map
        
        *** NodeJS-Installation ***
        
        /usr/bin/nodejs 	v16.19.1
        /usr/bin/node 		v16.19.1
        /usr/bin/npm 		8.19.3
        /usr/bin/npx 		8.19.3
        
        
        nodejs:
          Installed: 16.19.1-deb-1nodesource1
          Candidate: 16.20.0-deb-1nodesource1
          Version table:
             16.20.0-deb-1nodesource1 500
                500 https://deb.nodesource.com/node_16.x bullseye/main arm64 Packages
         *** 16.19.1-deb-1nodesource1 100
                100 /var/lib/dpkg/status
             12.22.12~dfsg-1~deb11u4 500
                500 http://security.debian.org/debian-security bullseye-security/main arm64 Packages
             12.22.12~dfsg-1~deb11u3 500
                500 http://deb.debian.org/debian bullseye/main arm64 Packages
        
        Temp directories causing npm8 problem: 0
        No problems detected
        
        *** ioBroker-Installation ***
        
        ioBroker Status
        iobroker is running on this host.
        
        
        Objects type: jsonl
        States  type: jsonl
        
        MULTIHOSTSERVICE/enabled: false
        
        Core adapters versions
        js-controller: 	4.0.24
        admin: 		6.3.5
        javascript: 	6.1.4
        
        Adapters from github: 	0
        
        Adapter State
        + system.adapter.admin.0                  : admin                 : iobroker                                 -  enabled, port: 8081, bind: 0.0.0.0, run as: admin
        + system.adapter.backitup.0               : backitup              : iobroker                                 -  enabled
        + system.adapter.cloud.0                  : cloud                 : iobroker                                 -  enabled
        + system.adapter.discovery.0              : discovery             : iobroker                                 -  enabled
        + system.adapter.email.0                  : email                 : iobroker                                 -  enabled
          system.adapter.influxdb.0               : influxdb              : iobroker                                 - disabled, port: 8086
        + system.adapter.javascript.0             : javascript            : iobroker                                 -  enabled
          system.adapter.modbus.0                 : modbus                : iobroker                                 - disabled
          system.adapter.mqtt-client.0            : mqtt-client           : iobroker                                 - disabled, port: 1883
        + system.adapter.mqtt.0                   : mqtt                  : iobroker                                 -  enabled, port: 1886, bind: 0.0.0.0
        + system.adapter.pvforecast.0             : pvforecast            : iobroker                                 -  enabled
        + system.adapter.pvforecast.1             : pvforecast            : iobroker                                 -  enabled
        + system.adapter.sonoff.0                 : sonoff                : iobroker                                 -  enabled, port: 1883, bind: 0.0.0.0
        + system.adapter.sql.0                    : sql                   : iobroker                                 -  enabled
        + system.adapter.telegram.0               : telegram              : iobroker                                 -  enabled, port: 8443, bind: 0.0.0.0
        + system.adapter.tuya.0                   : tuya                  : iobroker                                 -  enabled
          system.adapter.vis.0                    : vis                   : iobroker                                 -  enabled
        + system.adapter.web.0                    : web                   : iobroker                                 -  enabled, port: 8082, bind: 0.0.0.0, run as: admin
        + system.adapter.yahka.0                  : yahka                 : iobroker                                 -  enabled
        
        + instance is alive
        
        Enabled adapters with bindings
        + system.adapter.admin.0                  : admin                 : iobroker                                 -  enabled, port: 8081, bind: 0.0.0.0, run as: admin
        + system.adapter.mqtt.0                   : mqtt                  : iobroker                                 -  enabled, port: 1886, bind: 0.0.0.0
        + system.adapter.sonoff.0                 : sonoff                : iobroker                                 -  enabled, port: 1883, bind: 0.0.0.0
        + system.adapter.telegram.0               : telegram              : iobroker                                 -  enabled, port: 8443, bind: 0.0.0.0
        + system.adapter.web.0                    : web                   : iobroker                                 -  enabled, port: 8082, bind: 0.0.0.0, run as: admin
        
        ioBroker-Repositories
        stable        : http://download.iobroker.net/sources-dist.json
        beta          : http://download.iobroker.net/sources-dist-latest.json
        
        Active repo(s): stable
        
        Installed ioBroker-Instances
        Used repository: stable
        Adapter    "admin"        : 6.3.5    , installed 6.3.5
        Adapter    "backitup"     : 2.6.19   , installed 2.6.19
        Adapter    "cloud"        : 4.3.0    , installed 4.3.0
        Adapter    "discovery"    : 3.1.0    , installed 3.1.0
        Adapter    "email"        : 1.1.4    , installed 1.1.4
        Adapter    "influxdb"     : 3.2.0    , installed 3.2.0
        Adapter    "javascript"   : 6.1.4    , installed 6.1.4
        Controller "js-controller": 4.0.24   , installed 4.0.24
        Adapter    "modbus"       : 5.0.11   , installed 5.0.11
        Adapter    "mqtt"         : 4.0.7    , installed 4.0.7
        Adapter    "mqtt-client"  : 1.6.3    , installed 1.6.3
        Adapter    "pvforecast"   : 2.3.0    , installed 2.3.0
        Adapter    "simple-api"   : 2.7.2    , installed 2.7.2
        Adapter    "socketio"     : 4.2.0    , installed 4.2.0
        Adapter    "sonoff"       : 2.5.1    , installed 2.5.1
        Adapter    "sql"          : 2.2.0    , installed 2.2.0
        Adapter    "telegram"     : 1.15.2   , installed 1.15.2
        Adapter    "tuya"         : 3.13.1   , installed 3.13.1
        Adapter    "vis"          : 1.4.16   , installed 1.4.16
        Adapter    "web"          : 4.3.0    , installed 4.3.0
        Adapter    "ws"           : 1.3.0    , installed 1.3.0
        Adapter    "yahka"        : 0.17.0   , installed 0.17.0
        
        Objects and States
        Please stand by - This may take a while
        Objects: 	1263
        States: 	1103
        
        *** OS-Repositories and Updates ***
        Hit:1 http://archive.raspberrypi.org/debian bullseye InRelease
        Hit:2 http://deb.debian.org/debian bullseye InRelease                                                                                                       
        Hit:3 http://deb.debian.org/debian bullseye-updates InRelease                                                                                               
        Hit:4 http://security.debian.org/debian-security bullseye-security InRelease                                                                                
        Hit:5 https://deb.nodesource.com/node_16.x bullseye InRelease                                                                                               
        Hit:6 https://repos.influxdata.com/debian stable InRelease                                                                  
        Hit:7 https://packages.grafana.com/oss/deb stable InRelease                 
        Reading package lists... Done                         
        Pending Updates: 40
        
        *** Listening Ports ***
        Active Internet connections (only servers)
        Proto Recv-Q Send-Q Local Address           Foreign Address         State       User       Inode      PID/Program name    
        tcp        0      0 0.0.0.0:1886            0.0.0.0:*               LISTEN      1001       15515      946/io.mqtt.0       
        tcp        0      0 0.0.0.0:1887            0.0.0.0:*               LISTEN      1001       15516      946/io.mqtt.0       
        tcp        0      0 0.0.0.0:1883            0.0.0.0:*               LISTEN      1001       14905      961/io.sonoff.0     
        tcp        0      0 127.0.0.1:9000          0.0.0.0:*               LISTEN      1001       12261      774/iobroker.js-con 
        tcp        0      0 127.0.0.1:9001          0.0.0.0:*               LISTEN      1001       12254      774/iobroker.js-con 
        tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN      0          12083      523/sshd: /usr/sbin 
        tcp        0      0 0.0.0.0:39007           0.0.0.0:*               LISTEN      1001       15615      1021/io.yahka.0     
        tcp6       0      0 :::8086                 :::*                    LISTEN      999        1643       649/influxd         
        tcp6       0      0 :::8082                 :::*                    LISTEN      1001       1868       1036/io.web.0       
        tcp6       0      0 :::8081                 :::*                    LISTEN      1001       1686       821/io.admin.0      
        tcp6       0      0 :::22                   :::*                    LISTEN      0          12094      523/sshd: /usr/sbin 
        tcp6       0      0 :::3000                 :::*                    LISTEN      110        15384      645/grafana         
        udp        0      0 0.0.0.0:35976           0.0.0.0:*                           108        12792      396/avahi-daemon: r 
        udp        0      0 0.0.0.0:5353            0.0.0.0:*                           108        12790      396/avahi-daemon: r 
        udp        0      0 0.0.0.0:68              0.0.0.0:*                           0          12826      644/dhcpcd          
        udp        0      0 0.0.0.0:6666            0.0.0.0:*                           1001       1789       976/io.tuya.0       
        udp        0      0 0.0.0.0:6667            0.0.0.0:*                           1001       1791       976/io.tuya.0       
        udp6       0      0 :::5353                 :::*                                108        12791      396/avahi-daemon: r 
        udp6       0      0 :::35029                :::*                                108        12793      396/avahi-daemon: r 
        
        *** Log File - Last 25 Lines ***
        
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:     code: 'internal error',
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:     message: 'unexpected error writing points to database: timeout'
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:   },
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:   code: 'internal error',
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:   _retryAfter: 0
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]: }
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]: WARN: Write to InfluxDB failed (attempt: 1). m [HttpError]: unexpected error writing points to database: timeout
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at IncomingMessage.<anonymous> (/opt/iobroker/node_modules/@influxdata/influxdb-client/src/impl/node/NodeHttpTransport.ts:354:11)
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at IncomingMessage.emit (node:events:525:35)
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at IncomingMessage.emit (node:domain:489:12)
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at endReadableNT (node:internal/streams/readable:1358:12)
        2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at processTicksAndRejections (node:internal/process/task_queues:83:21) {
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   statusCode: 500,
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   statusMessage: 'Internal Server Error',
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   body: '{"code":"internal error","message":"unexpected error writing points to database: timeout"}',
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   contentType: 'application/json; charset=utf-8',
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   json: {
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:     code: 'internal error',
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:     message: 'unexpected error writing points to database: timeout'
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   },
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   code: 'internal error',
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   _retryAfter: 0
        2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]: }
        2023-05-06 11:27:15.358  - info: host.iobroker instance system.adapter.influxdb.0 terminated with code 0 (NO_ERROR)
        2023-05-06 11:27:15.359  - info: host.iobroker Do not restart adapter system.adapter.influxdb.0 because disabled or deleted
        
        

        ============ Mark until here for C&P =============

        ======================= SUMMARY =======================
        		     v.2023-04-15
        
        
        Operatingsystem: 	Debian GNU/Linux 11 (bullseye)
        Kernel: 		6.1.21-v8+
        Installation: 		Native
        Timezone: 		Europe/Berlin (CEST, +0200)
        User-ID: 		1000
        X-Server: 		false
        Boot Target: 		multi-user.target
        
        Pending OS-Updates: 	40
        Pending iob updates: 	0
        
        Nodejs-Installation: 	/usr/bin/nodejs 	v16.19.1
        			/usr/bin/node 		v16.19.1
        			/usr/bin/npm 		8.19.3
        			/usr/bin/npx 		8.19.3
        
        Recommended versions are nodejs 18.x.y and npm 9.x.y
        Your nodejs installation is correct
        
        MEMORY: 
                       total        used        free      shared  buff/cache   available
        Mem:            3.8G        1.5G        711M        1.0M        1.6G        2.2G
        Swap:            99M          0B         99M
        Total:          3.9G        1.5G        811M
        
        Active iob-Instances: 	16
        Active repo(s): stable
        
        ioBroker Core: 		js-controller 		4.0.24
        			admin 			6.3.5
        
        ioBroker Status: 	iobroker is running on this host.
        
        
        Objects type: jsonl
        States  type: jsonl
        
        Status admin and web instance:
        + system.adapter.admin.0                  : admin                 : iobroker                                 -  enabled, port: 8081, bind: 0.0.0.0, run as: admin
        + system.adapter.web.0                    : web                   : iobroker                                 -  enabled, port: 8082, bind: 0.0.0.0, run as: admin
        
        Objects: 		1263
        States: 		1103
        
        Size of iob-Database:
        
        9.5M	/opt/iobroker/iobroker-data/objects.jsonl
        4.9M	/opt/iobroker/iobroker-data/states.jsonl
        
        
        
        =================== END OF SUMMARY ====================
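        The check output above reports 4.0G of journald logs under /var/log/journal on a 7.2G /var. A possible cleanup on Debian, sketched with illustrative size values (the 200M cap is an assumption, adjust to taste):

```shell
# One-off: shrink archived journals to roughly 200 MB
sudo journalctl --vacuum-size=200M

# Persistent cap: set SystemMaxUse in /etc/systemd/journald.conf, e.g.
#   [Journal]
#   SystemMaxUse=200M
# then restart journald so the limit takes effect
sudo systemctl restart systemd-journald
```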
        
        • DJMarc75
          DJMarc75 @akuhnsh last edited by

          @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

          Pending OS-Updates: 40

          bring those up to date first 😉
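          The usual Debian route for the pending updates, run over SSH on the Pi (standard apt commands, nothing ioBroker-specific):

```shell
sudo apt update          # refresh the package lists
sudo apt full-upgrade    # apply the pending updates
sudo reboot              # only needed if a new kernel was installed
```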

          • akuhnsh
            akuhnsh @DJMarc75 last edited by

            @djmarc75

            Done...

            I've disabled the InfluxDB instance for now because of the persistent errors.
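            Before re-enabling it, the InfluxDB 2.x server itself can be probed directly; the /health endpoint is part of the InfluxDB 2.x API, and port 8086 matches the listening-ports output above (the service name influxdb is an assumption based on the repos.influxdata.com package):

```shell
# Ask the InfluxDB 2.x server whether it considers itself healthy
curl -s http://localhost:8086/health

# Recent influxd log lines often show why writes time out
sudo journalctl -u influxdb -n 50 --no-pager
```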

            Skript v.2023-04-15
            
            *** BASE SYSTEM ***
            Model		: Raspberry Pi 4 Model B Rev 1.2
            Architecture    : aarch64
            Docker          : false
            Virtualization  : none
            Distributor ID:	Debian
            Description:	Debian GNU/Linux 11 (bullseye)
            Release:	11
            Codename:	bullseye
            
            PRETTY_NAME="Debian GNU/Linux 11 (bullseye)"
            NAME="Debian GNU/Linux"
            VERSION_ID="11"
            VERSION="11 (bullseye)"
            VERSION_CODENAME=bullseye
            ID=debian
            HOME_URL="https://www.debian.org/"
            SUPPORT_URL="https://www.debian.org/support"
            BUG_REPORT_URL="https://bugs.debian.org/"
            
            Systemuptime and Load:
             12:39:31 up 1 day,  1:42,  1 user,  load average: 2.29, 0.99, 0.48
            CPU threads: 4
            
            Raspberry only:
            throttled=0x0
            Other values than 0x0 hint to temperature/voltage problems
            temp=66.2'C
            volt=0.8438V
            
            *** Time and Time Zones ***
                           Local time: Sat 2023-05-06 12:39:32 CEST
                       Universal time: Sat 2023-05-06 10:39:32 UTC
                             RTC time: n/a
                            Time zone: Europe/Berlin (CEST, +0200)
            System clock synchronized: yes
                          NTP service: active
                      RTC in local TZ: no
            
            *** User and Groups ***
            pi
            /home/pi
            pi adm dialout cdrom sudo audio video plugdev games users input render netdev gpio i2c spi iobroker
            
            *** X-Server-Setup ***
            X-Server: 	false
            Desktop: 	
            Terminal: 	tty
            Boot Target: 	multi-user.target
            
            *** MEMORY ***
                           total        used        free      shared  buff/cache   available
            Mem:            3.8G        1.5G        200M        1.0M        2.1G        2.3G
            Swap:            99M          0B         99M
            Total:          3.9G        1.5G        300M
            
                     3794 M total memory
                     1459 M used memory
                     1651 M active memory
                     1741 M inactive memory
                      200 M free memory
                       99 M buffer memory
                     2034 M swap cache
                       99 M total swap
                        0 M used swap
                       99 M free swap
            
            Raspberry only:
            oom events: 0
            lifetime oom required: 0 Mbytes
            total time in oom handler: 0 ms
            max time spent in oom handler: 0 ms
            
            *** FILESYSTEM ***
            Filesystem     Type      Size  Used Avail Use% Mounted on
            /dev/root      ext4      117G   18G   95G  16% /
            devtmpfs       devtmpfs  1.7G     0  1.7G   0% /dev
            tmpfs          tmpfs     1.9G     0  1.9G   0% /dev/shm
            tmpfs          tmpfs     759M  1.1M  758M   1% /run
            tmpfs          tmpfs     5.0M  4.0K  5.0M   1% /run/lock
            /dev/mmcblk0p1 vfat      255M   32M  224M  13% /boot
            tmpfs          tmpfs     380M     0  380M   0% /run/user/1000
            
            Messages concerning ext4 filesystem in dmesg:
            [Fri May  5 10:57:02 2023] Kernel command line: coherent_pool=1M 8250.nr_uarts=0 snd_bcm2835.enable_headphones=0 snd_bcm2835.enable_headphones=1 snd_bcm2835.enable_hdmi=1 snd_bcm2835.enable_hdmi=0  smsc95xx.macaddr=DC:A6:32:99:1F:06 vc_mem.mem_base=0x3ec00000 vc_mem.mem_size=0x40000000  console=ttyS0,115200 console=tty1 root=PARTUUID=accb09c9-02 rootfstype=ext4 fsck.repair=yes rootwait
            [Fri May  5 10:57:03 2023] EXT4-fs (mmcblk0p2): mounted filesystem with ordered data mode. Quota mode: none.
            [Fri May  5 10:57:03 2023] VFS: Mounted root (ext4 filesystem) readonly on device 179:2.
            [Fri May  5 10:57:05 2023] EXT4-fs (mmcblk0p2): re-mounted. Quota mode: none.
            
            Show mounted filesystems (real ones only):
            TARGET  SOURCE         FSTYPE OPTIONS
            /       /dev/mmcblk0p2 ext4   rw,noatime
            `-/boot /dev/mmcblk0p1 vfat   rw,relatime,fmask=0022,dmask=0022,codepage=437,iocharset=ascii,shortname=mixed,errors=remount-ro
            
            Files in neuralgic directories:
            
            /var:
            7.4G	/var/
            4.2G	/var/log
            4.0G	/var/log/journal/16936caa76ee4e328d697dd513277359
            4.0G	/var/log/journal
            2.3G	/var/lib
            
            Archived and active journals take up 3.9G in the file system.
            
            /opt/iobroker/backups:
            7.0G	/opt/iobroker/backups/
            
            /opt/iobroker/iobroker-data:
            199M	/opt/iobroker/iobroker-data/
            153M	/opt/iobroker/iobroker-data/files
            51M	/opt/iobroker/iobroker-data/files/javascript.admin
            48M	/opt/iobroker/iobroker-data/files/telegram.admin
            32M	/opt/iobroker/iobroker-data/files/javascript.admin/static/js
            
            The five largest files in iobroker-data are:
            15M	/opt/iobroker/iobroker-data/files/telegram.admin/rules/static/js/vendors-node_modules_iobroker_adapter-react-v5_assets_devices_parseNames_d_ts-node_modules_io-1d9f06.44fe4a3f.chunk.js.map
            9.5M	/opt/iobroker/iobroker-data/objects.jsonl
            7.3M	/opt/iobroker/iobroker-data/states.jsonl
            7.2M	/opt/iobroker/iobroker-data/files/telegram.admin/rules/static/js/vendors-node_modules_iobroker_adapter-react-v5_assets_devices_parseNames_d_ts-node_modules_io-1d9f06.44fe4a3f.chunk.js
            6.9M	/opt/iobroker/iobroker-data/files/telegram.admin/custom/static/js/vendors-node_modules_mui_icons-material_esm_index_js.8fdf8cb7.chunk.js.map
            
            *** NodeJS-Installation ***
            
            /usr/bin/nodejs 	v16.20.0
            /usr/bin/node 		v16.20.0
            /usr/bin/npm 		8.19.4
            /usr/bin/npx 		8.19.4
            
            
            nodejs:
              Installed: 16.20.0-deb-1nodesource1
              Candidate: 16.20.0-deb-1nodesource1
              Version table:
             *** 16.20.0-deb-1nodesource1 500
                    500 https://deb.nodesource.com/node_16.x bullseye/main arm64 Packages
                    100 /var/lib/dpkg/status
                 12.22.12~dfsg-1~deb11u4 500
                    500 http://security.debian.org/debian-security bullseye-security/main arm64 Packages
                 12.22.12~dfsg-1~deb11u3 500
                    500 http://deb.debian.org/debian bullseye/main arm64 Packages
            
            Temp directories causing npm8 problem: 0
            No problems detected
            
            *** ioBroker-Installation ***
            
            ioBroker Status
            iobroker is running on this host.
            
            
            Objects type: jsonl
            States  type: jsonl
            
            MULTIHOSTSERVICE/enabled: false
            
            Core adapters versions
            js-controller: 	4.0.24
            admin: 		6.3.5
            javascript: 	6.1.4
            
            Adapters from github: 	0
            
            Adapter State
            + system.adapter.admin.0                  : admin                 : iobroker                                 -  enabled, port: 8081, bind: 0.0.0.0, run as: admin
            + system.adapter.backitup.0               : backitup              : iobroker                                 -  enabled
            + system.adapter.cloud.0                  : cloud                 : iobroker                                 -  enabled
            + system.adapter.discovery.0              : discovery             : iobroker                                 -  enabled
            + system.adapter.email.0                  : email                 : iobroker                                 -  enabled
              system.adapter.influxdb.0               : influxdb              : iobroker                                 - disabled, port: 8086
            + system.adapter.javascript.0             : javascript            : iobroker                                 -  enabled
              system.adapter.modbus.0                 : modbus                : iobroker                                 - disabled
              system.adapter.mqtt-client.0            : mqtt-client           : iobroker                                 - disabled, port: 1883
            + system.adapter.mqtt.0                   : mqtt                  : iobroker                                 -  enabled, port: 1886, bind: 0.0.0.0
            + system.adapter.pvforecast.0             : pvforecast            : iobroker                                 -  enabled
            + system.adapter.pvforecast.1             : pvforecast            : iobroker                                 -  enabled
            + system.adapter.sonoff.0                 : sonoff                : iobroker                                 -  enabled, port: 1883, bind: 0.0.0.0
            + system.adapter.sql.0                    : sql                   : iobroker                                 -  enabled
            + system.adapter.telegram.0               : telegram              : iobroker                                 -  enabled, port: 8443, bind: 0.0.0.0
            + system.adapter.tuya.0                   : tuya                  : iobroker                                 -  enabled
              system.adapter.vis.0                    : vis                   : iobroker                                 -  enabled
            + system.adapter.web.0                    : web                   : iobroker                                 -  enabled, port: 8082, bind: 0.0.0.0, run as: admin
            + system.adapter.yahka.0                  : yahka                 : iobroker                                 -  enabled
            
            + instance is alive
            
            Enabled adapters with bindings
            + system.adapter.admin.0                  : admin                 : iobroker                                 -  enabled, port: 8081, bind: 0.0.0.0, run as: admin
            + system.adapter.mqtt.0                   : mqtt                  : iobroker                                 -  enabled, port: 1886, bind: 0.0.0.0
            + system.adapter.sonoff.0                 : sonoff                : iobroker                                 -  enabled, port: 1883, bind: 0.0.0.0
            + system.adapter.telegram.0               : telegram              : iobroker                                 -  enabled, port: 8443, bind: 0.0.0.0
            + system.adapter.web.0                    : web                   : iobroker                                 -  enabled, port: 8082, bind: 0.0.0.0, run as: admin
            
            ioBroker-Repositories
            stable        : http://download.iobroker.net/sources-dist.json
            beta          : http://download.iobroker.net/sources-dist-latest.json
            
            Active repo(s): stable
            
            Installed ioBroker-Instances
            Used repository: stable
            Adapter    "admin"        : 6.3.5    , installed 6.3.5
            Adapter    "backitup"     : 2.6.19   , installed 2.6.19
            Adapter    "cloud"        : 4.3.0    , installed 4.3.0
            Adapter    "discovery"    : 3.1.0    , installed 3.1.0
            Adapter    "email"        : 1.1.4    , installed 1.1.4
            Adapter    "influxdb"     : 3.2.0    , installed 3.2.0
            Adapter    "javascript"   : 6.1.4    , installed 6.1.4
            Controller "js-controller": 4.0.24   , installed 4.0.24
            Adapter    "modbus"       : 5.0.11   , installed 5.0.11
            Adapter    "mqtt"         : 4.0.7    , installed 4.0.7
            Adapter    "mqtt-client"  : 1.6.3    , installed 1.6.3
            Adapter    "pvforecast"   : 2.3.0    , installed 2.3.0
            Adapter    "simple-api"   : 2.7.2    , installed 2.7.2
            Adapter    "socketio"     : 4.2.0    , installed 4.2.0
            Adapter    "sonoff"       : 2.5.1    , installed 2.5.1
            Adapter    "sql"          : 2.2.0    , installed 2.2.0
            Adapter    "telegram"     : 1.15.2   , installed 1.15.2
            Adapter    "tuya"         : 3.13.1   , installed 3.13.1
            Adapter    "vis"          : 1.4.16   , installed 1.4.16
            Adapter    "web"          : 4.3.0    , installed 4.3.0
            Adapter    "ws"           : 1.3.0    , installed 1.3.0
            Adapter    "yahka"        : 0.17.0   , installed 0.17.0
            
            Objects and States
            Please stand by - This may take a while
            Objects: 	1263
            States: 	1103
            
            *** OS-Repositories and Updates ***
            Hit:1 http://security.debian.org/debian-security bullseye-security InRelease
            Hit:2 http://deb.debian.org/debian bullseye InRelease                                                                                                       
            Hit:3 http://deb.debian.org/debian bullseye-updates InRelease                                                                                               
            Hit:4 https://deb.nodesource.com/node_16.x bullseye InRelease                                                                                               
            Hit:5 http://archive.raspberrypi.org/debian bullseye InRelease                                                                                              
            Hit:6 https://repos.influxdata.com/debian stable InRelease                                                                                                  
            Hit:7 https://packages.grafana.com/oss/deb stable InRelease                 
            Reading package lists... Done                         
            Pending Updates: 0
            
            *** Listening Ports ***
            Active Internet connections (only servers)
            Proto Recv-Q Send-Q Local Address           Foreign Address         State       User       Inode      PID/Program name    
            tcp        0      0 0.0.0.0:1886            0.0.0.0:*               LISTEN      1001       15515      946/io.mqtt.0       
            tcp        0      0 0.0.0.0:1887            0.0.0.0:*               LISTEN      1001       15516      946/io.mqtt.0       
            tcp        0      0 0.0.0.0:1883            0.0.0.0:*               LISTEN      1001       14905      961/io.sonoff.0     
            tcp        0      0 127.0.0.1:9000          0.0.0.0:*               LISTEN      1001       12261      774/iobroker.js-con 
            tcp        0      0 127.0.0.1:9001          0.0.0.0:*               LISTEN      1001       12254      774/iobroker.js-con 
            tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN      0          12083      523/sshd: /usr/sbin 
            tcp        0      0 0.0.0.0:39007           0.0.0.0:*               LISTEN      1001       15615      1021/io.yahka.0     
            tcp6       0      0 :::8086                 :::*                    LISTEN      999        1643       649/influxd         
            tcp6       0      0 :::8082                 :::*                    LISTEN      1001       1868       1036/io.web.0       
            tcp6       0      0 :::8081                 :::*                    LISTEN      1001       1686       821/io.admin.0      
            tcp6       0      0 :::22                   :::*                    LISTEN      0          12094      523/sshd: /usr/sbin 
            tcp6       0      0 :::3000                 :::*                    LISTEN      110        1837583    15453/grafana       
            udp        0      0 0.0.0.0:41614           0.0.0.0:*                           108        1837323    15544/avahi-daemon: 
            udp        0      0 0.0.0.0:5353            0.0.0.0:*                           108        1837321    15544/avahi-daemon: 
            udp        0      0 0.0.0.0:68              0.0.0.0:*                           0          12826      644/dhcpcd          
            udp        0      0 0.0.0.0:6666            0.0.0.0:*                           1001       1789       976/io.tuya.0       
            udp        0      0 0.0.0.0:6667            0.0.0.0:*                           1001       1791       976/io.tuya.0       
            udp6       0      0 :::5353                 :::*                                108        1837322    15544/avahi-daemon: 
            udp6       0      0 :::52551                :::*                                108        1837324    15544/avahi-daemon: 
            
            *** Log File - Last 25 Lines ***
            
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:     code: 'internal error',
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:     message: 'unexpected error writing points to database: timeout'
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:   },
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:   code: 'internal error',
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]:   _retryAfter: 0
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[518]: }
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]: WARN: Write to InfluxDB failed (attempt: 1). m [HttpError]: unexpected error writing points to database: timeout
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at IncomingMessage.<anonymous> (/opt/iobroker/node_modules/@influxdata/influxdb-client/src/impl/node/NodeHttpTransport.ts:354:11)
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at IncomingMessage.emit (node:events:525:35)
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at IncomingMessage.emit (node:domain:489:12)
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at endReadableNT (node:internal/streams/readable:1358:12)
            2023-05-06 11:27:15.357  - error: host.iobroker Caught by controller[519]:     at processTicksAndRejections (node:internal/process/task_queues:83:21) {
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   statusCode: 500,
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   statusMessage: 'Internal Server Error',
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   body: '{"code":"internal error","message":"unexpected error writing points to database: timeout"}',
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   contentType: 'application/json; charset=utf-8',
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   json: {
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:     code: 'internal error',
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:     message: 'unexpected error writing points to database: timeout'
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   },
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   code: 'internal error',
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]:   _retryAfter: 0
            2023-05-06 11:27:15.358  - error: host.iobroker Caught by controller[519]: }
            2023-05-06 11:27:15.358  - info: host.iobroker instance system.adapter.influxdb.0 terminated with code 0 (NO_ERROR)
            2023-05-06 11:27:15.359  - info: host.iobroker Do not restart adapter system.adapter.influxdb.0 because disabled or deleted
            
            
            ======================= SUMMARY =======================
            		     v.2023-04-15
            
            
            Operatingsystem: 	Debian GNU/Linux 11 (bullseye)
            Kernel: 		6.1.21-v8+
            Installation: 		Native
            Timezone: 		Europe/Berlin (CEST, +0200)
            User-ID: 		1000
            X-Server: 		false
            Boot Target: 		multi-user.target
            
            Pending OS-Updates: 	0
            Pending iob updates: 	0
            
            Nodejs-Installation: 	/usr/bin/nodejs 	v16.20.0
            			/usr/bin/node 		v16.20.0
            			/usr/bin/npm 		8.19.4
            			/usr/bin/npx 		8.19.4
            
            Recommended versions are nodejs 18.x.y and npm 9.x.y
            Your nodejs installation is correct
            
            MEMORY: 
                           total        used        free      shared  buff/cache   available
            Mem:            3.8G        1.4G        416M        1.0M        1.9G        2.3G
            Swap:            99M          0B         99M
            Total:          3.9G        1.4G        516M
            
            Active iob-Instances: 	16
            Active repo(s): stable
            
            ioBroker Core: 		js-controller 		4.0.24
            			admin 			6.3.5
            
            ioBroker Status: 	iobroker is running on this host.
            
            
            Objects type: jsonl
            States  type: jsonl
            
            Status admin and web instance:
            + system.adapter.admin.0                  : admin                 : iobroker                                 -  enabled, port: 8081, bind: 0.0.0.0, run as: admin
            + system.adapter.web.0                    : web                   : iobroker                                 -  enabled, port: 8082, bind: 0.0.0.0, run as: admin
            
            Objects: 		1263
            States: 		1103
            
            Size of iob-Database:
            
            9.5M	/opt/iobroker/iobroker-data/objects.jsonl
            7.6M	/opt/iobroker/iobroker-data/states.jsonl
            
            
            
            =================== END OF SUMMARY ====================
            
            1 Reply Last reply Reply Quote 0
            • Marc Berg
              Marc Berg Most Active last edited by

              @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

              No computer was running at home at the time, and nobody was there. The disaster simply fell out of the sky, and I have no idea how to restore my database or how to avoid something like this in the future.

              If I read the logs correctly, you have everything installed on an SD card? I wouldn't do that for a system that matters to me. The obvious suspicion is that the filesystem got corrupted. An InfluxDB doesn't normally just crash on its own.

              Does anyone have an idea and a step-by-step procedure for me?

              We can get that back, provided the backup is intact. If the InfluxDB has "forgotten" itself, you will also have a new token that you need to enter in BackItUp. Before that, though, I would have the cause investigated.

              akuhnsh 1 Reply Last reply Reply Quote 0
              • akuhnsh
                akuhnsh @Marc Berg last edited by

                @marc-berg said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                If I read the logs correctly, you have everything installed on an SD card? I wouldn't do that for a system that matters to me. The obvious suspicion is that the filesystem got corrupted. An InfluxDB doesn't normally just crash on its own.

                What medium do you use? The system matters to me, and a small HDD over USB is one idea that came to mind. Does powering it over USB work on the Pi?

                We can get that back, provided the backup is intact. If the InfluxDB has "forgotten" itself, you will also have a new token that you need to enter in BackItUp. Before that, though, I would have the cause investigated.

                BackItUp has the new token and, as mentioned, I set Influx up again exactly as it was before. Deleted or rather renamed the bucket and then started the restore via BackItUp. Just tried it again:

                Started restore ...
                [DEBUG] [influxDB] - Created tmp directory
                [DEBUG] [influxDB] - Start infuxDB Restore ...
                [DEBUG] [influxDB] - influxdb.0 is stopped
                [ERROR] [influxDB] - Error: failed to check existence of organization "privat": 401 Unauthorized: unauthorized access
                
                [DEBUG] [influxDB] - Try deleting the InfluxDB tmp directory
                [DEBUG] [influxDB] - InfluxDB tmp directory was successfully deleted
                [DEBUG] [influxDB] - infuxDB Restore completed successfully
                [EXIT] influxDB restore done
                [DEBUG] [influxDB] - influxdb.0 started
                
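                The failing step above is BackItUp checking whether the organization "privat" exists. That check can be reproduced manually with the influx CLI before retrying the restore — a minimal sketch, assuming the server runs on localhost:8086 and `NEWTOKEN` is a placeholder for the current admin token:

                ```shell
                # Hedged sketch: reproduce BackItUp's failing organization check
                # by hand. HOST and NEWTOKEN are placeholders for your own setup.
                HOST="http://localhost:8086"
                NEWTOKEN="<paste the current admin token here>"
                if command -v influx >/dev/null 2>&1; then
                  # A 401 Unauthorized here confirms the token/org pair is the
                  # problem, independent of BackItUp.
                  influx org list --host "$HOST" --token "$NEWTOKEN"
                else
                  echo "influx CLI not found on this machine"
                fi
                ```

                If this also returns 401 Unauthorized, the token stored in BackItUp does not match any active authorization on the server.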
                Homoran 1 Reply Last reply Reply Quote 0
                • Marc Berg
                  Marc Berg Most Active last edited by

                  @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                  What medium do you use? The system matters to me, and a small HDD over USB is one idea that came to mind. Does powering it over USB work on the Pi?

                  I'm afraid I'm no help there. I tried it once with an SSD over USB but gave up in frustration, because there are several dependencies regarding the right adapter etc. I've been running on a NUC for a while now and am happy with it.

                  BackItUp has the new token and, as mentioned, I set Influx up again exactly as it was before. Deleted or rather renamed the bucket and then started the restore via BackItUp. Just tried it again:

                  Where did you get the token from?

                  akuhnsh 1 Reply Last reply Reply Quote 0
                  • Homoran
                    Homoran Global Moderator Administrators @akuhnsh last edited by

                    @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                    Does powering it over USB work on the Pi?

                    There are a few pitfalls:

                    • The Pi's power supply must be strong enough
                    • On the Pi 4 there is electrical interference affecting the USB 3 ports
                      • USB 2 may not deliver enough power

                    USB 2 with a powered hub would then be one solution

                    1 Reply Last reply Reply Quote 0
                    • akuhnsh
                      akuhnsh @Marc Berg last edited by

                      @marc-berg said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                      Where did you get the token from?

                      After the welcome screen I set InfluxDB up via the GUI the way it was before, then copied the new admin token and entered it in BackItUp as well as in the Influx instance. The connection test there passes. It still doesn't write any values to the DB, though

                      influxdb.0
                      2023-05-06 13:13:38.511	info	Add point that had error for mqtt.0.PL1 to buffer again, error-count=4
                      
                      influxdb.0
                      2023-05-06 13:13:38.510	warn	Error on writePoint("{"value":196.79,"time":"2023-05-06T11:12:38.280Z","from":"system.adapter.mqtt.0","q":0,"ack":true}): HttpError: unexpected error writing points to database: timeout / "unexpected error writing points to database: timeout""
                      
                      influxdb.0
                      2023-05-06 13:13:38.510	warn	Point could not be written to database: iobroker
                      
                      Marc Berg 1 Reply Last reply Reply Quote 0
                      • Marc Berg
                        Marc Berg Most Active @akuhnsh last edited by

                        @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                        After the welcome screen I set InfluxDB up via the GUI the way it was before, then copied the new admin token and entered it in BackItUp as well as in the Influx instance. The connection test there passes. It still doesn't write any values to the DB, though

                        influxdb.0
                        2023-05-06 13:13:38.511	info	Add point that had error for mqtt.0.PL1 to buffer again, error-count=4
                        
                        influxdb.0
                        2023-05-06 13:13:38.510	warn	Error on writePoint("{"value":196.79,"time":"2023-05-06T11:12:38.280Z","from":"system.adapter.mqtt.0","q":0,"ack":true}): HttpError: unexpected error writing points to database: timeout / "unexpected error writing points to database: timeout""
                        
                        influxdb.0
                        2023-05-06 13:13:38.510	warn	Point could not be written to database: iobroker
                        

                        But this is a timeout, not an authorization error. To me that points to yet another problem.

                        akuhnsh 1 Reply Last reply Reply Quote 0
                        • akuhnsh
                          akuhnsh @Marc Berg last edited by

                          @marc-berg @thomas-braun

                          @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                          I set the system up from scratch to see whether the backups could then be restored, and there is still no way to restore the database from the backup:

                          [ERROR] [influxDB] - Error: failed to check existence of organization "privat": 401 Unauthorized: unauthorized access
                          
                          Marc Berg 1 Reply Last reply Reply Quote 0
                          • Marc Berg
                            Marc Berg Most Active @akuhnsh last edited by Marc Berg

                            @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                            @marc-berg @thomas-braun

                            @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                            I set the system up from scratch to see whether the backups could then be restored, and there is still no way to restore the database from the backup:

                            [ERROR] [influxDB] - Error: failed to check existence of organization "privat": 401 Unauthorized: unauthorized access
                            

                            please show me the output of

                            influx auth list --json
                            

                            and

                            influx organization list
                            
                            akuhnsh 1 Reply Last reply Reply Quote 0
                            • akuhnsh
                              akuhnsh @Marc Berg last edited by

                              @marc-berg said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                              @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                              @marc-berg @thomas-braun

                              @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                              I set the system up from scratch to see whether the backups could then be restored, and there is still no way to restore the database from the backup:

                              [ERROR] [influxDB] - Error: failed to check existence of organization "privat": 401 Unauthorized: unauthorized access
                              

                              please show me the output of

                              influx auth list --json
                              
                              [
                              	{
                              		"id": "0b28411ba4dca000",
                              		"description": "admin's Token",
                              		"token": "wljQN6vtZ1a4hNNORVh55RM2u0o1V3XXh708y7yKPmtdBIV5tG5hxucGRGnBX1IdQoYh9Mnd5dsbBBV551nDUw==",
                              		"status": "active",
                              		"userName": "admin",
                              		"userID": "0b28411b6b9ca000",
                              		"permissions": [
                              			"read:/authorizations",
                              			"write:/authorizations",
                              			"read:/buckets",
                              			"write:/buckets",
                              			"read:/dashboards",
                              			"write:/dashboards",
                              			"read:/orgs",
                              			"write:/orgs",
                              			"read:/sources",
                              			"write:/sources",
                              			"read:/tasks",
                              			"write:/tasks",
                              			"read:/telegrafs",
                              			"write:/telegrafs",
                              			"read:/users",
                              			"write:/users",
                              			"read:/variables",
                              			"write:/variables",
                              			"read:/scrapers",
                              			"write:/scrapers",
                              			"read:/secrets",
                              			"write:/secrets",
                              			"read:/labels",
                              			"write:/labels",
                              			"read:/views",
                              			"write:/views",
                              			"read:/documents",
                              			"write:/documents",
                              			"read:/notificationRules",
                              			"write:/notificationRules",
                              			"read:/notificationEndpoints",
                              			"write:/notificationEndpoints",
                              			"read:/checks",
                              			"write:/checks",
                              			"read:/dbrp",
                              			"write:/dbrp",
                              			"read:/notebooks",
                              			"write:/notebooks",
                              			"read:/annotations",
                              			"write:/annotations",
                              			"read:/remotes",
                              			"write:/remotes",
                              			"read:/replications",
                              			"write:/replications"
                              		]
                              	},
                              	{
                              		"id": "0b2852f1fbdca000",
                              		"description": "admin's Token (cloned at 2023-05-06 13:44:19)",
                              		"token": "uZKGetlcGZSZbteI1gwpkJwsCIHpj0bI5ZLdRTvEuDqds7B9X04Q8LKhPYQArrWRJ3nnQS3xfQcwDkwhOKKoCw==",
                              		"status": "active",
                              		"userName": "admin",
                              		"userID": "0b28411b6b9ca000",
                              		"permissions": [
                              			"read:/authorizations",
                              			"write:/authorizations",
                              			"read:/buckets",
                              			"write:/buckets",
                              			"read:/dashboards",
                              			"write:/dashboards",
                              			"read:/orgs",
                              			"write:/orgs",
                              			"read:/sources",
                              			"write:/sources",
                              			"read:/tasks",
                              			"write:/tasks",
                              			"read:/telegrafs",
                              			"write:/telegrafs",
                              			"read:/users",
                              			"write:/users",
                              			"read:/variables",
                              			"write:/variables",
                              			"read:/scrapers",
                              			"write:/scrapers",
                              			"read:/secrets",
                              			"write:/secrets",
                              			"read:/labels",
                              			"write:/labels",
                              			"read:/views",
                              			"write:/views",
                              			"read:/documents",
                              			"write:/documents",
                              			"read:/notificationRules",
                              			"write:/notificationRules",
                              			"read:/notificationEndpoints",
                              			"write:/notificationEndpoints",
                              			"read:/checks",
                              			"write:/checks",
                              			"read:/dbrp",
                              			"write:/dbrp",
                              			"read:/notebooks",
                              			"write:/notebooks",
                              			"read:/annotations",
                              			"write:/annotations",
                              			"read:/remotes",
                              			"write:/remotes",
                              			"read:/replications",
                              			"write:/replications"
                              		]
                              	}
                              ]
                              

                              and

                              influx organization list
                              
                              ID			Name
                              2ecf2d695d1722e9	privat
                              
                              akuhnsh 1 Reply Last reply Reply Quote 0
                              • akuhnsh
                                akuhnsh @akuhnsh last edited by

                                @Marc-Berg @Thomas-Braun

                                I can get at the old tokens from the backup via the *.bolt file:

                                influxd recovery auth list --bolt-path /opt/iobroker/backups/20230503T213019Z.bolt
                                2023-05-06T14:09:35.271809Z	info	Resources opened	{"log_id": "0hdMBf9l000", "system": "bolt-kvstore", "path": "/opt/iobroker/backups/20230503T213019Z.bolt"}
                                ID			User Name	User ID			Description						Token							Permissions
                                0ab87d8e68295000	admin		0ab87d8e29e95000	admin's Token						lGhesEEm7Osc8iF3Atk0ukEZhEOLdg98AQXojtHwao1MrEOYiv0uaa3BrxS-QXp3T54dXJgDECIu7zuByns0Ug==	[read:authorizations write:authorizations read:buckets write:buckets read:dashboards write:dashboards read:orgs write:orgs read:sources write:sources read:tasks write:tasks read:telegrafs write:telegrafs read:users write:users read:variables write:variables read:scrapers write:scrapers read:secrets write:secrets read:labels write:labels read:views write:views read:documents write:documents read:notificationRules write:notificationRules read:notificationEndpoints write:notificationEndpoints read:checks write:checks read:dbrp write:dbrp read:notebooks write:notebooks read:annotations write:annotations read:remotes write:remotes read:replications write:replications]
                                0b07e2e2b57cb000	admin		0ab87d8e29e95000	admin's Token (cloned at 2023-04-11 09:03:19)		06eJGdVPZPMsCad0sRvWS32YvB-F3Fifn3nTCM0oNBzDaY5I7Q0n-St-82r9vj1dZRnOONeIiWmTFDYIZoe3Gg==	[read:authorizations write:authorizations read:buckets write:buckets read:dashboards write:dashboards read:orgs write:orgs read:sources write:sources read:tasks write:tasks read:telegrafs write:telegrafs read:users write:users read:variables write:variables read:scrapers write:scrapers read:secrets write:secrets read:labels write:labels read:views write:views read:documents write:documents read:notificationRules write:notificationRules read:notificationEndpoints write:notificationEndpoints read:checks write:checks read:dbrp write:dbrp read:notebooks write:notebooks read:annotations write:annotations read:remotes write:remotes read:replications write:replications]
                                0b07e9d2857cb000	admin		0ab87d8e29e95000	admin's Token BackItUp (cloned at 2023-04-11 09:33:38)	Tvm2bJ9-WsIoyNowWCasjc2ukqsFTIj3xQjp_q_1U-sPb6sH21n35nUzV99EOJ3JE2vLKKwCPIrN8jtT4xcT4A==	[read:authorizations write:authorizations read:buckets write:buckets read:dashboards write:dashboards read:orgs write:orgs read:sources write:sources read:tasks write:tasks read:telegrafs write:telegrafs read:users write:users read:variables write:variables read:scrapers write:scrapers read:secrets write:secrets read:labels write:labels read:views write:views read:documents write:documents read:notificationRules write:notificationRules read:notificationEndpoints write:notificationEndpoints read:checks write:checks read:dbrp write:dbrp read:notebooks write:notebooks read:annotations write:annotations read:remotes write:remotes read:replications write:replications]
                                0b07f463437cb000	admin		0ab87d8e29e95000	admin's Token Grafana (cloned at 2023-04-11 10:19:47)	AQLf8Jld1fY0LYd3Cs0Pt4hKtzMbXK0t0_kMdh6eXzBxHiPJWR1t5OYHr7Qvx4-4zf1Q4ETOfII5W1oNCKvR5w==	[read:authorizations write:authorizations read:buckets write:buckets read:dashboards write:dashboards read:orgs write:orgs read:sources write:sources read:tasks write:tasks read:telegrafs write:telegrafs read:users write:users read:variables write:variables read:scrapers write:scrapers read:secrets write:secrets read:labels write:labels read:views write:views read:documents write:documents read:notificationRules write:notificationRules read:notificationEndpoints write:notificationEndpoints read:checks write:checks read:dbrp write:dbrp read:notebooks write:notebooks read:annotations write:annotations read:remotes write:remotes read:replications write:replications]
                                

                                But how am I supposed to reconstruct an old token via the GUI?

                                Marc Berg 1 Reply Last reply Reply Quote 0
                                • Marc Berg
                                  Marc Berg Most Active @akuhnsh last edited by Marc Berg

                                  @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                                  But how am I supposed to reconstruct an old token via the GUI?

                                  That would only make sense if the old database still exists. Did you delete the data directories during the reinstall? Just to make sure no leftovers are interfering.

                                  akuhnsh 1 Reply Last reply Reply Quote 0
                                  • akuhnsh
                                    akuhnsh @Marc Berg last edited by

                                    @marc-berg said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                                    @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                                    But how am I supposed to reconstruct an old token via the GUI?

                                    That would only make sense if the old database still exists. Did you delete the data directories during the reinstall? Just to make sure no leftovers are interfering.

                                    The old tokens are stored in the backup. When they are queried, they simply don't match the new admin token. That's why I was thinking of doing it in reverse: first restore the old tokens in the new InfluxDB, then restore the backup with those same tokens. I think that's where the problem lies.

                                    I wrote a fresh image to the SD card and reinstalled iobroker, influxdb and grafana, just as I have done before. That should have overwritten everything.

                                    Marc Berg 1 Reply Last reply Reply Quote 0
                                    • Marc Berg
                                      Marc Berg Most Active @akuhnsh last edited by Marc Berg

                                      @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                                      The old tokens are stored in the backup. When they are queried, they simply don't match the new admin token. That's why I was thinking of doing it in reverse: first restore the old tokens in the new InfluxDB, then restore the backup with those same tokens. I think that's where the problem lies.

                                      No. After a reinstall you need the new tokens for the restore.
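
                                      [Editorial note: a minimal sketch of how the new token can be listed or created with the InfluxDB 2.x CLI on the freshly installed system. The organization name "myorg" and the description text are placeholders, not values from this thread.]

                                      ```shell
                                      # List the tokens that exist on the freshly installed InfluxDB 2.x
                                      influx auth list

                                      # If no suitable token exists, create a new all-access token.
                                      # "myorg" is a placeholder -- substitute your actual organization name.
                                      influx auth create --org myorg --all-access \
                                          --description "BackItUp restore token"
                                      ```

                                      The token printed by `influx auth create` is the one that then belongs in the BackItUp influxdb tab.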

                                      akuhnsh 2 Replies Last reply Reply Quote 0
                                      • akuhnsh
                                        akuhnsh @Marc Berg last edited by

                                        @marc-berg

                                        Okay, understood. Maybe the cause is that the backup comes from an upgrade from 1.8 to 2.7? I get a similar error message when I disable "backup of multiple systems" in that tab and specify only the one bucket that is also selected for the restore:

                                        Started restore ...
                                        [DEBUG] [influxDB] - Created tmp directory
                                        [DEBUG] [influxDB] - Start infuxDB Restore ...
                                        [ERROR] [influxDB] - 2023/05/06 16:49:17 INFO: Restoring bucket "0db467ecf59e0de1" as "iobroker"
                                        Error: failed to restore bucket "iobroker": 401 Unauthorized: read:authorizations is unauthorized
                                        
                                        [DEBUG] [influxDB] - Try deleting the InfluxDB tmp directory
                                        [DEBUG] [influxDB] - InfluxDB tmp directory was successfully deleted
                                        [DEBUG] [influxDB] - infuxDB Restore completed successfully
                                        [EXIT] influxDB restore done
                                        

                                        Does the new token need to be entered anywhere else besides the influxdb tab in BackItUp?

                                        Marc Berg 1 Reply Last reply Reply Quote 0
                                        • akuhnsh
                                          akuhnsh @Marc Berg last edited by

                                          @marc-berg

                                          If I create the bucket beforehand, I get the following message:

                                          Started restore ...
                                          [DEBUG] [influxDB] - Created tmp directory
                                          [DEBUG] [influxDB] - Start infuxDB Restore ...
                                          [ERROR] [influxDB] - 2023/05/06 16:52:54 INFO: Restoring bucket "0db467ecf59e0de1" as "iobroker"
                                          Error: failed to restore bucket "iobroker": 422 Unprocessable Entity: bucket with name iobroker already exists
                                          
                                          [DEBUG] [influxDB] - Try deleting the InfluxDB tmp directory
                                          [DEBUG] [influxDB] - InfluxDB tmp directory was successfully deleted
                                          [DEBUG] [influxDB] - infuxDB Restore completed successfully
                                          [EXIT] influxDB restore done
                                          
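                                          [Editorial note: the "422 Unprocessable Entity: bucket with name iobroker already exists" error suggests that the restore wants to create the bucket itself. A common workaround, sketched here under that assumption, is to delete the pre-created bucket and let `influx restore` recreate it. "myorg" and the backup path are placeholders, not values from this thread.]

                                          ```shell
                                          # Delete the manually created bucket so the restore can recreate it.
                                          # "myorg" is a placeholder for your organization name.
                                          influx bucket delete --name iobroker --org myorg

                                          # Then restore only that bucket from the extracted backup directory
                                          # (the path is an example -- use the directory your backup was
                                          # unpacked to).
                                          influx restore --bucket iobroker /path/to/extracted/backup
                                          ```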
                                          1 Reply Last reply Reply Quote 0
                                          • Marc Berg
                                            Marc Berg Most Active @akuhnsh last edited by

                                            @akuhnsh said in InfluxDB 2.7 Totalabsturz - BackItUp mit Fehler:

                                            Maybe the cause is that the backup comes from an upgrade from 1.8 to 2.7

                                            I don't understand that. The backup was taken from 2.7 after the upgrade, right?

                                            Are you quite sure that the token that is visible via "influx auth list" is entered correctly, without any whitespace, in the BackItUp adapter? For a plain restore it does not need to be entered anywhere else.
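
                                            [Editorial note: a quick, hedged way to sanity-check the token outside BackItUp. The InfluxDB 2.x endpoint `/api/v2/me` returns the authenticated user for a valid token; the token value and host/port below are placeholders.]

                                            ```shell
                                            # Verify the token is valid and free of stray whitespace:
                                            # a valid token returns the authenticated user as JSON,
                                            # an invalid one returns 401 Unauthorized.
                                            TOKEN='paste-your-token-here'
                                            curl -s -H "Authorization: Token ${TOKEN}" \
                                                http://localhost:8086/api/v2/me
                                            ```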

                                            akuhnsh 1 Reply Last reply Reply Quote 0