
how to use cd in ci #940

Open
yhan219 opened this issue Apr 17, 2023 · 9 comments
Labels
kind/bug Categorizes issue or PR as related to a bug.

Comments


yhan219 commented Apr 17, 2023

Which version of KubeSphere DevOps has this issue?

v3.3.2

How did you install Kubernetes, or which Kubernetes distribution are you using?

k8s v1.22.12

What happened?

CD can now be used in CI, but when I try it, it fails. What do I need to do?

Relevant log output

io.fabric8.kubernetes.client.http.WebSocketHandshakeException
	at io.fabric8.kubernetes.client.okhttp.OkHttpWebSocketImpl$BuilderImpl$1.onFailure(OkHttpWebSocketImpl.java:65)
	at okhttp3.internal.ws.RealWebSocket.failWebSocket(RealWebSocket.java:571)
	at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:198)
	at okhttp3.RealCall$AsyncCall.execute(RealCall.java:203)
	at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
	Suppressed: java.lang.Throwable: waiting here
		at io.fabric8.kubernetes.client.utils.Utils.waitUntilReady(Utils.java:164)
		at io.fabric8.kubernetes.client.utils.Utils.waitUntilReadyOrFail(Utils.java:175)
		at io.fabric8.kubernetes.client.dsl.internal.core.v1.PodOperationsImpl.exec(PodOperationsImpl.java:322)
		at io.fabric8.kubernetes.client.dsl.internal.core.v1.PodOperationsImpl.exec(PodOperationsImpl.java:84)
		at org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1.doLaunch(ContainerExecDecorator.java:427)
		at org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1.launch(ContainerExecDecorator.java:344)
		at hudson.Launcher$ProcStarter.start(Launcher.java:507)
		at org.jenkinsci.plugins.durabletask.BourneShellScript.launchWithCookie(BourneShellScript.java:176)
		at org.jenkinsci.plugins.durabletask.FileMonitoringTask.launch(FileMonitoringTask.java:132)
		at org.jenkinsci.plugins.workflow.steps.durable_task.DurableTaskStep$Execution.start(DurableTaskStep.java:320)
		at org.jenkinsci.plugins.workflow.cps.DSL.invokeStep(DSL.java:319)
		at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:193)
		at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:122)
		at jdk.internal.reflect.GeneratedMethodAccessor539.invoke(Unknown Source)
		at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
		at java.base/java.lang.reflect.Method.invoke(Method.java:566)
		at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
		at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
		at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1213)
		at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
		at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:42)
		at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
		at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
		at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:163)
		at org.kohsuke.groovy.sandbox.GroovyInterceptor.onMethodCall(GroovyInterceptor.java:23)
		at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:158)
		at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:161)
		at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:165)
		at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:135)
		at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:135)
		at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:135)
		at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
		at com.cloudbees.groovy.cps.impl.ContinuationGroup.methodCall(ContinuationGroup.java:86)
		at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.dispatchOrArg(FunctionCallBlock.java:113)
		at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.fixArg(FunctionCallBlock.java:83)
		at jdk.internal.reflect.GeneratedMethodAccessor160.invoke(Unknown Source)
		at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
		at java.base/java.lang.reflect.Method.invoke(Method.java:566)
		at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
		at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
		at com.cloudbees.groovy.cps.Next.step(Next.java:83)
		at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:174)
		at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:163)
		at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:129)
		at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:268)
		at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:163)
		at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:18)
		at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:51)
		at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:185)
		at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:402)
		at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$400(CpsThreadGroup.java:96)
		at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:314)
		at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:278)
		at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:67)
		at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
		at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:139)
		at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
		at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:68)
		at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
		at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
		... 3 more
Caused by: java.net.ProtocolException: Expected HTTP 101 response but was '400 Bad Request'
	at okhttp3.internal.ws.RealWebSocket.checkResponse(RealWebSocket.java:229)
	at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:196)
	... 5 more
An error has occurred.

Additional information

My Jenkinsfile:

pipeline {
  agent {
    node {
      label 'maven'
    }

  }
  stages {
    stage('clone code') {
      agent none
      steps {
        container('maven') {
          git(url: 'https://e.coding.net/test/test/restaurant.git', credentialsId: 'coding-id', changelog: true, poll: false)
        }

      }
    }

    stage('build & push') {
      agent none
      steps {
        container('maven') {
          sh 'mvn clean package -Dmaven.test.skip=true'
          sh 'docker build -f $DOCKER_FILE_PATH/Dockerfile -t $REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:SNAPSHOT-$BRANCH_NAME-$BUILD_NUMBER .'
          withCredentials([usernamePassword(passwordVariable : 'DOCKER_PASSWORD' ,usernameVariable : 'DOCKER_USERNAME' ,credentialsId : "$DOCKER_CREDENTIAL_ID" ,)]) {
            sh 'echo "$DOCKER_PASSWORD" | docker login $REGISTRY -u "$DOCKER_USERNAME" --password-stdin'
            sh 'docker push  $REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:SNAPSHOT-$BRANCH_NAME-$BUILD_NUMBER'
          }

        }

      }
    }

    stage('push latest') {
      agent none
      steps {
        container('maven') {
          sh 'docker tag  $REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:SNAPSHOT-$BRANCH_NAME-$BUILD_NUMBER $REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:latest '
          sh 'docker push  $REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:latest '
        }

      }
    }

    stage('持续部署') {
      agent none
      steps {
        container('base') {
          withCredentials([usernamePassword(credentialsId : 'coding-id' ,passwordVariable : 'PASS' ,usernameVariable : 'USER')]) {
            sh '''ks app update --app-name restaurant-cd \\
 --app-namespace dong42xjfqc \\
 --name core.harbor.domain/rest/restaurant:latest \\
 --newName core.harbor.domain/rest/restaurant:latest \\
 --git-password $PASS --git-username=$USER \\
 --git-target-branch master'''
          }

        }

      }
    }

  }
  environment {
    DOCKER_CREDENTIAL_ID = 'harbor-id'
    REGISTRY = 'core.harbor.domain'
    DOCKERHUB_NAMESPACE = 'rest'
    GITHUB_ACCOUNT = 'kubesphere'
    DOCKER_FILE_PATH = '.'
    APP_NAME = 'restaurant'
  }
}
@yhan219 yhan219 added the kind/bug Categorizes issue or PR as related to a bug. label Apr 17, 2023

yhan219 commented Apr 17, 2023

[screenshot]


yhan219 commented Apr 17, 2023

My initial suspicion is that the earlier stages all use container('maven') while the final CD stage uses container('base'); the WebSocket handshake likely fails with 400 Bad Request because the agent pod created for the maven label has no container named base, so the exec request is rejected. But if I replace 'base' with 'maven', the build reports that the ks command does not exist.


Aurorxa commented Apr 21, 2023

That's because you are using maven by default; you can use base in the last stage:

stage('持续部署') {
      agent {
        node {
          label 'base'
        }
      }
      steps {
        container('base') {
          withCredentials([usernamePassword(credentialsId : 'coding-id' ,passwordVariable : 'PASS' ,usernameVariable : 'USER')]) {
            sh '''ks app update --app-name restaurant-cd \\
 --app-namespace dong42xjfqc \\
 --name core.harbor.domain/rest/restaurant:latest \\
 --newName core.harbor.domain/rest/restaurant:latest \\
 --git-password $PASS --git-username=$USER \\
 --git-target-branch master'''
          }

        }

      }
    }

That said, I've tested it myself and it is quite problematic: my repository is on Gitee, and it errors out immediately!


yhan219 commented Apr 21, 2023

If the last stage uses maven, it reports that the ks command does not exist.


Aurorxa commented Apr 22, 2023

If the last stage uses maven, it reports that the ks command does not exist.

That's because the ks CLI is not installed in the default Maven image, which is why it errors out. You can build your own image on top of the official one with the ks CLI installed, then swap it in for the official maven image in Jenkins, and that works.

However, even if you get that far, it still fails. I've tried many approaches, even manually running a Pod, but the ks tool seems to be broken: it cannot pull private repositories from Gitee, which is very strange. So the only workable option is the method I provided above.

Aurorxa commented Apr 22, 2023

As an alternative, here is the stage I use instead; it updates the kustomize config repository directly with git:

stage('持续部署') {
            agent none
            steps {
                withEnv(["BUILD_STAGE_NAME=阶段的名称"]) {
                    script {
                        BUILD_STAGE_NAME = "持续部署"
                    }
                }

                container('maven') {
                    // Clear the current directory
                    sh '''
                      pwd && ls -lah
                      rm -rf *
                      pwd && ls -lah
                      '''
                    // Check out the repo that holds the kustomize config, because my project source and manifests live in two separate repositories
                    git(credentialsId: 'gitee-id', url: 'xxx', branch: 'master', changelog: true, poll: false)
                    // Update the config
                    withCredentials([usernamePassword(credentialsId: "gitee-id",
                            usernameVariable: "GIT_USERNAME",
                            passwordVariable: "GIT_PASSWORD")]) {
                        sh 'git config --local credential.helper "!p() { echo username=\\$GIT_USERNAME; echo password=\\$GIT_PASSWORD; }; p"'
                        sh 'git config --global user.name "${GIT_USERNAME}"'
                        sh 'git config --global user.email "${GIT_USERNAME}"'
                        sh '''
                          pwd && ls -lah && cd xxx/xxxxxxxx && ls -lah
                          sed -i "s#registry-vpc.*#$REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:v$BUILD_NUMBER#g" deployment.yaml
                          git add -A && git commit -m "update tag: v$BUILD_NUMBER" && git push -u origin master  
                          '''
                    }
                }
            }
        }
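The sed-based image update in the stage above can be exercised locally. This is a minimal sketch, assuming a hypothetical deployment.yaml whose image reference begins with registry-vpc (the same prefix the pipeline's sed pattern matches); all file contents and variable values here are illustrative:

```shell
# Hypothetical manifest with a placeholder image, mimicking the config repo
cat > deployment.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
spec:
  template:
    spec:
      containers:
      - name: app
        image: registry-vpc.example.com/ns/app:v1
EOF

# Example values standing in for the pipeline's environment variables
REGISTRY=core.harbor.domain
DOCKERHUB_NAMESPACE=rest
APP_NAME=restaurant
BUILD_NUMBER=42

# Same substitution the pipeline runs: replace everything from "registry-vpc"
# to the end of the line with the freshly pushed image reference
sed -i "s#registry-vpc.*#$REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:v$BUILD_NUMBER#g" deployment.yaml

grep 'image:' deployment.yaml   # shows the updated image reference
```

After the substitution, committing and pushing the changed manifest (as the stage does) is what triggers the redeploy on the CD side.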


yhan219 commented Apr 24, 2023

I'm a bit confused. I've now added a Jenkins agent:

- name: "maven-base"
  label: "maven-base"
  inheritFrom: "maven"
  containers:
  - name: "maven"
    image: "kubesphere/builder-maven:v3.2.1-jdk11"
  - name: "base"
    image: "kubesphere/builder-base:v3.2.2"
    command: "cat"
    args: ""
    ttyEnabled: true
    privileged: false
    resourceRequestCpu: "100m"
    resourceLimitCpu: "4000m"
    resourceRequestMemory: "100Mi"
    resourceLimitMemory: "8192Mi"

After switching to the maven-base label, the last CD step is:

stage('cd') {
      steps {
        container('base') {
          withCredentials([usernamePassword(credentialsId : 'coding-id' ,passwordVariable : 'PASS' ,usernameVariable : 'USER')]) {
            sh '''ks app update --app-name restaurant-cd \\
 --app-namespace dong42xjfqc \\
 --name core.harbor.domain/rest/restaurant:latest \\
 --newName core.harbor.domain/rest/restaurant:latest \\
 --git-password $PASS --git-username=$USER \\
 --git-target-branch master'''
          }

        }

      }
    }

It still fails:

  ks app update --app-name restaurant-cd --app-namespace dong42xjfqc --name core.harbor.domain/rest/restaurant:latest --newName core.harbor.domain/rest/restaurant:SNAPSHOT--34 --git-password **** [email protected] --git-target-branch master
Error: Missing kustomization file 'kustomization.yaml'.

Usage:
  kustomize edit set image [flags]

Examples:

The command
  set image postgres=eu.gcr.io/my-project/postgres:latest my-app=my-registry/my-app@sha256:24a0c4b4a4c0eb97a1aabb8e29f18e917d05abfe1b7a7c07857230879ce7d3d3
will add

images:
- name: postgres
  newName: eu.gcr.io/my-project/postgres
  newTag: latest
- digest: sha256:24a0c4b4a4c0eb97a1aabb8e29f18e917d05abfe1b7a7c07857230879ce7d3d3
  name: my-app
  newName: my-registry/my-app

to the kustomization file if it doesn't exist,
and overwrite the previous ones if the image name exists.

The command
  set image node:8.15.0 mysql=mariadb alpine@sha256:24a0c4b4a4c0eb97a1aabb8e29f18e917d05abfe1b7a7c07857230879ce7d3d3
will add

images:
- name: node
  newTag: 8.15.0
- name: mysql
  newName: mariadb
- digest: sha256:24a0c4b4a4c0eb97a1aabb8e29f18e917d05abfe1b7a7c07857230879ce7d3d3
  name: alpine

to the kustomization file if it doesn't exist,
and overwrite the previous ones if the image name exists.

The image tag can only contain alphanumeric, '.', '_' and '-'. Passing * (asterisk) either as the new name, 
the new tag, or the digest will preserve the appropriate values from the kustomization file.


Flags:
  -h, --help   help for image

Global Flags:
      --stack-trace   print a stack-trace on error

Usage:
  ks app update [flags]

Aliases:
  update, up

Flags:
      --app-name string            The name of the application
      --app-namespace string       The namespace of the application
  -d, --digest string              Digest is the value used to replace the original image tag. If digest is present NewTag value is ignored
      --git-password string        The password of the git provider
      --git-provider string        The flag --mode=pr need the git provider, the mode will fallback to commit if the git provider is empty
      --git-target-branch string   The target branch name that you want to push
      --git-username string        The username of the git provider
  -h, --help                       help for update
  -n, --name string                Name is a tag-less image name
      --newName string             NewName is the value used to replace the original name
  -t, --newTag string              NewTag is the value used to replace the original tag
      --secret-name string         The username of the git provider
      --secret-namespace string    The username of the git provider

Global Flags:
      --context string   Sets a context entry in kubeconfig

failed to push changes, error is: failed to push branch, error is: unable to checkout git branch: 397527233, error: a branch named "refs/heads/master" already exists
script returned exit code 1
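The usage text in the log shows that `ks app update` drives `kustomize edit set image`, which must run in a directory of the target branch that contains a `kustomization.yaml`. Below is a minimal sketch of such a file, with the `images` entry shaped like the examples in the log; the resource file name is an illustrative assumption, not taken from the actual repository:

```yaml
# kustomization.yaml — hypothetical minimal example; `kustomize edit set image`
# rewrites the images: list below (creating it if it is absent)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - deployment.yaml   # assumed manifest name
images:
  - name: core.harbor.domain/rest/restaurant
    newName: core.harbor.domain/rest/restaurant
    newTag: latest
```

Committing a file like this to the branch that `--git-target-branch` points at should at least get past the "Missing kustomization file" error; the subsequent "branch already exists" push failure is a separate problem.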


yhan219 commented Apr 24, 2023

There is no documentation; it feels like an experimental feature.


Aurorxa commented Apr 25, 2023

There is no documentation; it feels like an experimental feature.

Hmm, I don't think you've quite understood yet: in a pipeline, each stage can use its own agent, so there is no need to reconfigure anything:

pipeline {
    agent {  // Global agent: every stage can use it
        node {
            label 'mavenjdk11'
        }

    }

    options { // Optional settings
        timeout(time: 5, unit: 'HOURS')
    }

    environment { // Environment variables
        DOCKER_CREDENTIAL_ID = 'aliyun-docker-id'
        REGISTRY = 'XXX'
        DOCKERHUB_NAMESPACE = 'XXX'
        APP_NAME = 'XXXX'
    }

    stages {
        stage('拉取代码') {
            agent none
            steps {
                withEnv(["BUILD_STAGE_NAME=阶段的名称"]) {
                    script {
                        BUILD_STAGE_NAME = "拉取代码"
                    }
                }

                git(credentialsId: 'gitee-id', url: 'XXXX', branch: 'master', changelog: true, poll: false)
            }
        }

        stage('项目编译') {
            agent none
            steps {
                withEnv(["BUILD_STAGE_NAME=阶段的名称"]) {
                    script {
                        BUILD_STAGE_NAME = "项目编译"
                    }
                }

                container('maven') {
                    sh 'mvn clean package -Dmaven.test.skip=true'
                }

            }
        }

        stage('构建镜像') {
            agent none
            steps {
                withEnv(["BUILD_STAGE_NAME=阶段的名称"]) {
                    script {
                        BUILD_STAGE_NAME = "构建镜像"
                    }
                }

                container('maven') {
                    sh 'docker build --no-cache --force-rm -t $APP_NAME:latest -f Dockerfile .'
                }

            }
        }

        stage('推送镜像') {
            agent none
            steps {
                withEnv(["BUILD_STAGE_NAME=阶段的名称"]) {
                    script {
                        BUILD_STAGE_NAME = "推送镜像"
                    }
                }

                container('maven') {
                    withCredentials([usernamePassword(credentialsId: 'aliyun-docker-id', usernameVariable: 'DOCKER_USER_VAR', passwordVariable: 'DOCKER_PASSWORD_VAR')]) {
                        sh 'echo "$DOCKER_PASSWORD_VAR" | docker login $REGISTRY -u "$DOCKER_USER_VAR" --password-stdin'
                        sh 'docker tag  $APP_NAME:latest $REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:v$BUILD_NUMBER'
                        sh 'docker push $REGISTRY/$DOCKERHUB_NAMESPACE/$APP_NAME:v$BUILD_NUMBER'
                    }

                }

            }
        }

        stage('持续部署') {
            agent { // Stage-level agent: only this stage can use it
                node {
                    label 'base'
                }
            }
            steps {
                container('base') {
                    withCredentials([usernamePassword(credentialsId : 'coding-id' ,passwordVariable : 'PASS' ,usernameVariable : 'USER')]) {
                        sh '''ks app update --app-name restaurant-cd \\
                        --app-namespace dong42xjfqc \\
                        --name core.harbor.domain/rest/restaurant:latest \\
                        --newName core.harbor.domain/rest/restaurant:latest \\
                        --git-password $PASS --git-username=$USER \\
                        --git-target-branch master'''
                    }
                }
            }
        }

        stage('清空镜像') {
            agent none
            steps {
                withEnv(["BUILD_STAGE_NAME=阶段的名称"]) {
                    script {
                        BUILD_STAGE_NAME = "清空镜像"
                    }
                }

                container('maven') {
                    sh 'docker system prune -af'
                }
            }
        }

    }

    post { // Post-build actions
        always {
            echo '总是执行'
            sh 'pwd && ls -lah'
        }

        success {
            echo '后置执行 ---> 成功后执行...'
            mail(to: '', cc: '', subject: "${APP_NAME}的构建报告", body: '流水线构建成功^_^')
        }

        failure {
            echo '后置执行 ---> 失败后执行...'
            mail(to: '', cc: '', subject: "${APP_NAME}的构建报告", body: "${BUILD_STAGE_NAME}阶段》构建失败o(╥﹏╥)o")
        }

        aborted {
            echo '后置执行 ---> 取消后执行...'
            mail(to: '', cc: '', subject: "${APP_NAME}的构建报告", body: "${APP_NAME}流水线被手动取消了o(╥﹏╥)o")
        }

    }

}
