
  • Our picks

    • We're excited to start laying out our plans for ibi Summit 2025, and we want your input!
      To help us understand your preferences, we kindly request that you take a few minutes to complete a short survey. Your feedback is essential to us. 
    • This article outlines the steps I took to get WebFOCUS Container Edition (WF-CE) and prerequisites running in a Kubernetes deployment.
    • This video provides a basic overview of Standard Report creation using WebFOCUS 9.3 Designer. Geared towards new users, it demonstrates core features and techniques for building and formatting tabular reports. Key topics include column selection and creation, content styling, header and footer implementation, and automatic drill-down functionality. 

       

       
    • 🏆 We have been recognized as an Experience and Trust Leader in the Dresner Advisory Services 2024 Industry Excellence Awards! 🏆 As always, we are grateful for our phenomenal customer base and look forward to continuing to build alongside you and offer exceptional products. Read more in the full press release here. 
    • If you’re familiar with scheduling in ReportCaster, then you probably know that there are many recurrence options you can use to send your content at regular intervals with a variety of time-based options. For example, you can send your content every week, every 2 weeks, or every Tuesday and Friday. What you may not know, however, is that you can also schedule those items to send more dynamically using an alert. An alert tests for certain conditions in a report; when a schedule runs against that alert and the alert test fails, the report won’t send. This lets you schedule your content to send dynamically based on changes or key thresholds in your data, so when you receive an email or other distribution from ReportCaster, you know it contains new or important information.


      The following video shows how to create an alert test based on a filtered report, how to schedule the alert, and what happens if the alert passes or fails.
      Have you tried alerts before in other scenarios, or do you have questions about the capabilities of alerts? Let us know in the comments!

       

Welcome to the ibi Community

See recent community activity

  • In today’s data-driven world, organizations are continuously looking for ways to extract meaningful insights from the vast amounts of information they collect. Machine learning (ML) has become an indispensable tool for uncovering patterns, trends, and correlations within data. Among the various machine learning techniques, clustering stands out as one of the most intuitive and valuable methods, especially when you have data that isn’t clearly labeled or classified.
    In this article, we will explore clustering in machine learning, its importance, and how WebFOCUS, a powerful business intelligence and analytics platform, facilitates clustering to drive data-driven decision-making.
    What is Clustering in Machine Learning?
    Clustering is a type of unsupervised learning technique that groups similar data points together based on their features, without any prior labels. The primary goal of clustering is to identify natural patterns or structures in data. Unlike classification, where the data is pre-labeled, clustering allows you to explore the dataset and uncover hidden relationships.
    In simple terms, clustering helps you organize data into clusters, where items within the same cluster are more similar to each other than to those in other clusters.
    How Clustering is Used in Business Intelligence
    Clustering plays a critical role in a wide range of applications in business intelligence, such as:
    Customer Segmentation: Grouping customers based on purchasing behavior, demographics, or interactions can help businesses target marketing efforts more effectively and personalize customer experiences.
    Anomaly Detection: Identifying outliers or unusual patterns in data can help with fraud detection, system monitoring, or quality control.
    Market Research: Clustering can reveal distinct segments within the market, providing insights into which products or services may appeal to different consumer groups.
    Recommendation Systems: By clustering users based on preferences or behaviors, businesses can recommend products or content that align with their interests.
    WebFOCUS and Clustering
    Let’s consider a retail company that wants to perform customer segmentation. The goal is to group customers based on their purchasing behaviors, such as purchase frequency, total spend, and product preferences.
    Step 1: Load the customer data into WebFOCUS. This data might include demographic details, transaction history, and behavioral data.
    Step 2: Use WebFOCUS's data preparation tools to clean the data. Remove any incomplete or irrelevant entries, handle missing values, and normalize numerical features like spending or purchase frequency.
    Step 3: Apply a clustering algorithm (e.g., K-Means) to the data to identify distinct groups. WebFOCUS will calculate the optimal number of clusters and segment the customers accordingly.
    Step 4: Visualize the results. WebFOCUS will create interactive dashboards to showcase the clusters, highlighting the characteristics of each group. For instance, one cluster may represent high-value, frequent buyers, while another may represent occasional shoppers.
    Step 5: Use the clusters to tailor marketing strategies. The company can develop targeted campaigns, personalized offers, or product recommendations based on the characteristics of each customer group.
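    To make the K-Means step above concrete, here is a deliberately tiny sketch in plain Python, outside of WebFOCUS. The customer data, the cluster count, and the naive initialization are all made up for illustration; WebFOCUS performs this clustering internally, so treat this only as a picture of what the algorithm does.

    ```python
    # Toy K-Means sketch: group customers by (purchase frequency, total spend).
    # All numbers are synthetic; initialization is deliberately naive (k=2 only).

    def kmeans(points, k, iters=20):
        """Cluster 2-D points into k groups; returns (centroids, labels)."""
        centroids = [points[0], points[-1]]  # naive init: first and last point
        labels = [0] * len(points)
        for _ in range(iters):
            # Assignment step: attach each point to its nearest centroid.
            for i, (x, y) in enumerate(points):
                labels[i] = min(range(k),
                                key=lambda c: (x - centroids[c][0]) ** 2
                                            + (y - centroids[c][1]) ** 2)
            # Update step: move each centroid to the mean of its members.
            for c in range(k):
                members = [p for p, lab in zip(points, labels) if lab == c]
                if members:
                    centroids[c] = (sum(x for x, _ in members) / len(members),
                                    sum(y for _, y in members) / len(members))
        return centroids, labels

    # Two synthetic customer groups: frequent big spenders vs. occasional shoppers.
    frequent = [(10 + i % 3, 500 + 10 * i) for i in range(10)]
    occasional = [(1 + i % 2, 50 + 5 * i) for i in range(10)]
    centroids, labels = kmeans(frequent + occasional, k=2)
    print(labels)  # the two synthetic groups end up with two distinct labels
    ```

    A real implementation would normalize the features first (as Step 2 notes, spend and frequency are on very different scales) and choose k with something like the elbow method rather than hardcoding it.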
    Key Benefits of Clustering in WebFOCUS
    Actionable Insights: Clustering in WebFOCUS helps organizations uncover hidden patterns and trends, leading to better decision-making.
    User-Friendly: WebFOCUS simplifies the process of applying complex machine learning algorithms, making it accessible to both technical and non-technical users.
    Scalability: Whether you are working with a small dataset or a massive enterprise data warehouse, WebFOCUS scales to meet the needs of your organization.
    Interactive Dashboards: The platform's visualization capabilities make it easy to communicate clustering results to stakeholders, helping to drive business strategies and decisions.
    Conclusion
    Clustering is a fundamental machine learning technique that helps businesses uncover hidden insights, segment their customers, and make data-driven decisions. Whether you’re looking to improve customer segmentation, detect anomalies, or enhance marketing strategies, clustering with WebFOCUS can help you leverage the full potential of your data.
     
  • When customers attempt to build images for WebFOCUS Container Edition 1.2.4 / 1.3.1 or earlier versions independently using the ./build-images.sh script, they may encounter an EPEL 404 Not Found error for the wfc (appserver) image, as illustrated in the screenshot below.

    This error simply indicates that the EPEL version you are trying to download has expired.
    How to resolve EPEL 404 Not Found issue:
         1. Update the WebFOCUS Client Dockerfile in the wfc directory
     
    vi IBI_wfce_1.3.1/wfc/Dockerfile
    Replace
    https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
    with
    https://dl.fedoraproject.org/pub/archive/epel/7/x86_64/Packages/e/epel-release-7-14.noarch.rpm
    or use the sed command below:
    sed -i 's|https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm|https://dl.fedoraproject.org/pub/archive/epel/7/x86_64/Packages/e/epel-release-7-14.noarch.rpm|g' IBI_wfce_1.3.1/wfc/Dockerfile
    Once updated, the Dockerfile should look like the following:

     
         2. Once the Dockerfile is updated, execute ./build-images.sh again
    ./IBI_wfce_1.3.1/scripts/build-images.sh
    Upon successful completion of the image build process, the console should display the following output:

     
    Alternatively, customers can use the pre-built images included in the WebFOCUS Container Edition installer. Please refer to the official documentation:
    https://docs.tibco.com/pub/wfce/1.3.1/doc/html/Default.htm#Installation-and-Deployment-Guide/Using_images_for_docker.htm#Download
     
    Thank you for your interest in WebFOCUS CE and we hope this documentation proves useful to you! 
    For any further assistance, feel free to contact the ibi Support team.
  • This article outlines the steps I took to get WebFOCUS Container Edition (WF-CE) and prerequisites running in a Kubernetes deployment.
    This was done with the following configuration:
    OS: Ubuntu 24.04 (with 4 vCPUs and 16 GB RAM)
    Kind: v0.24.0 go1.22.6 Linux/amd64
    Helm: Version "v3.16.1"
    Kubectl: Client Version v1.31.0
    Helmfile: Version 0.167.1
    Docker engine: Version 24.0.7
    WebFOCUS Container Edition v1.3.1
    Some quick notes:
    You will need a license.txt file and must know your customer ID. If you do not have either, visit our product Support website and collect them first.
    General utility commands like ubuntu-desktop (for a local browser), curl, git, gunzip, watch, etc. should already be present. There are many Linux distributions; you need to know the install commands for your specific distribution.
    Cut and paste is not your friend: the bullets + tab will copy by default (you’ll need to watch for this and clean it up).
    If you are running on Windows, you will need Windows Subsystem for Linux (WSL v2) – not documented here.
    The following documentation was done on Ubuntu 24.04 and reflects that configuration only.
    In the next few steps, I install the prerequisites and confirm the version installed. I created a directory WFCE_131 and issued the following commands there.
     
    1. Install Kind – Quick start guide for different binaries (https://kind.sigs.k8s.io/docs/user/quick-start/ )
    # For AMD64 / x86_64 – I selected this command for my Ubuntu OS
    [ $(uname -m) = x86_64 ] && curl -Lo ./kind https://kind.sigs.k8s.io/dl/v0.24.0/kind-linux-amd64
    chmod +x ./kind
    sudo mv ./kind /usr/local/bin/kind
    kind version
    Expected output:
    kind v0.24.0 go1.22.6 linux/amd64
     
    2. Install helm
    # Download the install shell script
    curl -fsSL -o get_helm.sh https://raw.githubusercontent.com/helm/helm/master/scripts/get-helm-3
    # Change permissions allowing it to run
    chmod 700 get_helm.sh
    # Install
    . ./get_helm.sh    (notice the DOT space DOT-slash)
    # Double-check the helm permissions
    ls -lrt /usr/local/bin    (if the owner is root, change it to match the kind owner; it should be your ID, with group 2130 in my case)
    sudo chown csslmd:2130 /usr/local/bin/helm    (changes the owner to my ID csslmd and group 2130, matching kind)
    helm version
    Expected output:
    version.BuildInfo{Version:"v3.16.1", GitCommit:"5a5449dc42be07001fd5771d56429132984ab3ab", GitTreeState:"clean", GoVersion:"go1.22.7"}
     
    3. Install kubectl
    curl -LO https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl
    chmod 755 kubectl
    sudo mv kubectl /usr/local/bin/kubectl
    kubectl version
    Expected output:
    Client Version: v1.31.0  
    Kustomize Version: v5.4.2 
    Server Version: v1.31.0
     
    4. Install Helmfile – Select the correct distribution - https://github.com/helmfile/helmfile/releases/tag/v0.167.1
    wget -O helmfile_linux_amd64 https://github.com/helmfile/helmfile/releases/download/v0.167.1/helmfile_0.167.1_linux_amd64.tar.gz
    mv helmfile_linux_amd64 helmfile.tar.gz    # note the file name above and what is written to disk; the extension needs changing
    gunzip helmfile.tar.gz
    tar -xvf helmfile.tar    (I chose to remove the two README files as well as the LICENSE file – not needed; helmfile is what you are looking for)
    sudo mv helmfile /usr/local/bin/helmfile
    helmfile version
    Expected output:
     helmfile  Version            0.167.1
      Git Commit         86664f5
      Build Date         03 Aug 24 22:03 EDT (1 month ago)
      Commit Date        02 Aug 24 19:53 EDT (1 month ago)
      Dirty Build        no
      Go version         1.22.4
      Compiler           gc
      Platform           linux/amd64
     
    5. Install Docker
    sudo apt install -y docker.io
    sudo vi /etc/group    # edit the group file and add your user ID (mine is csslmd) appended to the docker entry, e.g.: docker:x:124:csslmd
    # After adding your ID to the docker group and saving, you MUST log out and back in for it to take effect.
    docker version
    Expected output:
    Client:
     Version:           24.0.7
     API version:       1.43
     Go version:        go1.22.2
     Git commit:        24.0.7-0ubuntu4.1
     Built:             Fri Aug  9 02:33:20 2024
     OS/Arch:           linux/amd64
     Context:           default (….. etc , yes there is more....)
     
    6. Confirm your pre-requisites are in the /usr/local/bin, with correct permissions (adjust if needed). This location is in your path by default.
    ls -lrt /usr/local/bin
    -rwxr-xr-x 1 csslmd csslmd 91693208 Aug  3 22:07 helmfile
    -rwxrwxr-x 1 csslmd csslmd  9930525 Sep 25 09:09 kind
    -rwxr-xr-x 1 csslmd csslmd 57122968 Sep 25 14:43 helm
    -rwxr-xr-x 1 csslmd csslmd 56381592 Sep 25 15:38 kubectl
    sudo ls -ltr /var/lib/docker    # Docker is written to a different location
    drwx------ 2 root root 4096 Oct 15 02:03 runtimes
    drwx--x--- 2 root root 4096 Oct 15 02:03 containers
    drwx-----x 2 root root 4096 Oct 15 02:03 volumes
    drwx------ 4 root root 4096 Oct 15 02:03 plugins
    -rw------- 1 root root   36 Oct 15 02:03 engine-id
    drwx------ 3 root root 4096 Oct 15 02:03 image
    drwxr-x--- 3 root root 4096 Oct 15 02:03 network
    drwx------ 2 root root 4096 Oct 15 02:03 swarm
    drwx--x--- 3 root root 4096 Oct 15 02:03 overlay2
    drwx--x--x 4 root root 4096 Oct 15 02:03 buildkit
    drwx------ 2 root root 4096 Oct 15 02:03 tmp
    ***************** Pre-reqs are completed *******************
    Installing WF-CE (as of the date this doc was written, 10/2024, 1.3.1 was the current version)
    1. Download at minimum the following 2 tar files from e-delivery and the doc – ibi WebFOCUS - Container Edition Add-on
    IBI_wfce_1.3.1.tar
    IBI_wfce_images_1.3.1.tar
    Create a directory for the files. (I created WFCE_131 and placed the files there.)
     
    2. load docker images from downloaded wfce images tar file:
    docker load -i IBI_wfce_images_1.3.1.tar    (from the location where you copied the tar files)
    docker images    (this command shows the newly loaded images; output in the screenshot below)
    NOTE: If the docker command requires sudo to run, you may not have logged out and back in (as directed above).
     
    3. Now you can untar the WF-CE install with the command below, which will untar WF-CE into a directory called IBI_wfce_1.3.1
    tar -xvf IBI_wfce_1.3.1.tar  
    4. We will now create a single node Kubernetes cluster using kind, called wf-ce.
    First upload the kind.with.ingress.yaml (from the yaml_files.zip attached to this topic) BEFORE running the kind command below. (I put mine in WFCE_131/.)
    kind create cluster --name wf-ce --config kind.with.ingress.yaml
    Expected output:

    Check cluster with the following command
    kubectl cluster-info
    Expected output:

    5. Set the required environment variables and then load images to kind cluster.
    run the following command from WFCE_131/IBI_wfce_1.3.1/scripts/helmfile
    . ./export-defaults.sh    (notice the DOT space DOT-slash before export-defaults.sh)
    Expected output:
     
    Type env to view your environment variables to confirm they are set. (the names + version # should look familiar)
    To load images to kind cluster, wf-ce, created above, run the following commands
    kind load docker-image ibi2020/webfocus:wfc-9.3-1.3.1 --name wf-ce
    kind load docker-image ibi2020/webfocus:wfs-9.3-1.3.1 --name wf-ce
    kind load docker-image ibi2020/webfocus:wfs-etc-9.3-1.3.1 --name wf-ce
    kind load docker-image ibi2020/webfocus:cm-9.3-1.3.1 --name wf-ce
    Expected output:
     
    6. Install NGINX Ingress controller by running the following command:  
    kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/main/deploy/static/provider/kind/deploy.yaml
    Expected output:
     
    Run the following command to validate Ingress is installed properly
    kubectl wait --namespace ingress-nginx --for=condition=ready pod --selector=app.kubernetes.io/component=controller --timeout=90s
    Expected output:

    7. View the images on the wf-ce-control plane
    docker exec -it wf-ce-control-plane crictl images
    Expected output:
     
    8. Use helmfile to build and deploy the infrastructure required by WebFOCUS CE to run. Running the below command builds and deploys the pods from the images onto the control-plane
    cd WFCE_131/IBI_wfce_1.3.1/scripts/helmfile/infra/
    helmfile sync
    NOTE: Since a lot will scroll by from the above command, you can run the following command from a separate window to monitor the infrastructure as it is built and deployed in the webfocus namespace:
    watch kubectl get pods -n webfocus
    Expected output of the watch command as it runs; the next 2 screenshots show it while running and then when all pods are 'Ready' (the screen updates every 2 seconds).
     

    9. Installing the WF-CE license
    As mentioned in the NOTES section at the top, you will require a license.txt file and must know your customer ID. If you do not know about either, then visit our product Support website.
    From WFCE_131/IBI_wfce_1.3.1/scripts/helmfile/environments/ directory,  make a backup copy of wf.integ.yaml.gotmpl (in case you make a vi mistake)
    cp wf.integ.yaml.gotmpl wf.integ.yaml.gotmpl.orig
    vi wf.integ.yaml.gotmpl
    Update wf.integ.yaml.gotmpl with your provided CUSTOMERID and accept the EULA by adding Y. You are modifying only the two lines below:
     WF_CUSTOMERID: "999999"  #Customer ID
     ACCEPT_EUA: "Y"  #You must accept the End User Agreement by setting to "Y"
    Copy the provided active license.txt file to the following location: WFCE_131/IBI_wfce_1.3.1/scripts/license. (You can overwrite what is there.)
    10. Load the prebuilt WebFOCUS images and deploy the pods to the wf-ce control-plane
    cd WFCE_131/IBI_wfce_1.3.1/scripts/helmfile
    helmfile -e dev sync
    This deploys the WF components to Kubernetes with a development profile (‘dev’), and a lot will scroll by, so we watch.
    Again, you can run the following to monitor as it builds:
    watch kubectl get pods -n webfocus
    Note – if the smoke test fails, don’t panic – move forward.
     
    11. Complete/fix the ingress by adding your hostname
    From the WFCE_131/IBI_wfce_1.3.1/scripts/helmfile/environments directory, create local-ingress.yaml (using the provided yaml_files.zip). Modify that local-ingress.yaml by changing the last two of the three localhost values to your hostname (lines 44 and 54, in lower case), then run the command below:
    kubectl apply -f local-ingress.yaml    (using the yaml file you created)
    Expected output:
     
    This was done on a local install of Ubuntu Desktop. You can confirm the overall install and the ingress change from a browser local to the host: just enter ‘localhost’ as the URL and it should resolve.
    If you see the login screen below after typing in localhost, you are good! The default credentials are located in the TIBCO WebFOCUS® Container Edition Installation and Deployment Guide.
    Please feel free to reach out with any questions. 
     


     
    yaml_files.zip
  • Get ready to experience a new level of portal creation starting with WebFOCUS release v9.3.0! 

    We've been busy enhancing the WebFOCUS Portal design to make building and managing your portals smoother and more intuitive than ever. Here's a glimpse of the exciting updates. WebFOCUS Portals now goes beyond basic data presentation. The new design offers a rich, interactive environment where you can explore data, uncover hidden patterns, and collaborate with colleagues. 
    Access to the new interface:
    You can now access the new Portals UI from the Hub. Click the Start Something New menu and then choose Create Portal to launch the tool.
     

    Streamlined Portal Creation
    We've integrated the portal creation options directly into the Designer Portal editor. This means you can now set all the essential portal properties, such as title, alias, theme, and navigation layout, within the same environment where you design your portal content.
    When creating a new portal, you'll be prompted to select a save location within your repository. This ensures that your portal is organized from the start and that you have full control over its placement.

     
    Portal Page Previews
    To give you a clearer picture of your portal's structure and design, we've added a preview feature to the Portal editor. Now, when you select a page or item within the tool, you'll see a non-interactive preview of how it will appear in the portal. This preview is cached for quick loading, so you can efficiently navigate and review your portal layout.
    Drag-and-Drop Navigation Builder
    Organizing your portal's navigation has never been easier. You can now drag and drop sections or pages onto the navigation toolbar. You can also drag and drop content directly into the navigation toolbars, reorder sections and pages, move them between horizontal and vertical toolbars, and even create submenus. This intuitive interface gives you complete control over your portal's navigation structure.
    Portals overview.mp4
    Flexible Content Linking
    When adding content to your portal, you now have the choice to add it as a copy or a shortcut. This gives you greater flexibility in managing your content and ensuring that changes are reflected across your portals as desired.
    Integrated Properties Panel
    You can now access and modify the properties of sections, pages, and content items directly from the portal editor. This eliminates the need to switch back to the Hub and provides a more streamlined workflow.
     

     
    Easy Sections Management
    Adding folders to your portal is now a simple drag-and-drop operation. You can easily create new sections to organize your content, and customize their properties, all within the portal editor. The section, when created, will add a folder on the Hub where the related content can be accessed.
    Convenient Right-Click Options
    Right-clicking on a section or page in your portal now provides quick access to common tasks such as Run, Edit, Hide at runtime, Delete and View Properties. This context-sensitive menu makes managing your portal content more efficient.

     
    Enhanced Theme and Logo Options
    We've added theme and logo options to the portal properties panel, giving you more control over your portal's appearance and branding. You can easily select themes, upload custom logos, and customize logo tooltips.
     

     
    Seamless Autosaving
    All changes made within the portal editor are autosaved in real-time, ensuring your work is always preserved. This includes changes to both portal content and properties, providing a smooth and worry-free editing experience.
    These enhancements are designed to make portal creation more intuitive, efficient, and user-friendly. We hope these improvements help you build better portals with ease!
     
  • In recent years, machine learning has transformed how we analyze data and make predictions across various domains. One particularly fascinating application is time series forecasting, a technique used to predict future values based on previously observed values. From stock prices to weather patterns, time series forecasting is essential for making informed decisions in numerous fields.
    What is Time Series Data?
    Before diving into forecasting methods, let’s define what time series data is. A time series is a sequence of data points collected or recorded at successive points in time, typically at uniform intervals. Examples include:
    Financial data: Stock prices, sales figures, or economic indicators.
    Environmental data: Temperature readings, humidity levels, or pollution measurements.
    Web analytics: Page views, user sessions, or conversion rates.
    The unique characteristic of time series data is its temporal ordering, meaning the order of the observations matters. This temporal structure allows for patterns and trends to be identified, which are crucial for accurate forecasting.
    The Importance of Time Series Forecasting
    Time series forecasting has a wide range of applications, including:
    Business and Finance: Companies use forecasting to predict sales, optimize inventory levels, and budget for future expenses.
    Weather Prediction: Meteorologists utilize forecasting to provide accurate weather reports.
    Healthcare: Hospitals use it for patient admissions forecasting to manage resources effectively.
    Energy: Utility companies forecast demand to ensure they can meet consumption needs without overproducing.
    Given the potential impact of accurate predictions, organizations are increasingly adopting machine learning techniques for time series forecasting.
    Time Series Forecasting in ibi WebFOCUS
    ibi WebFOCUS provides powerful tools for time series forecasting, enabling organizations to analyze historical data and make accurate predictions. With its intuitive interface, users can easily create forecasts using various statistical and machine learning models. WebFOCUS supports advanced features like trend analysis and real-time data integration, allowing businesses to adapt their strategies based on predictive insights. Whether for sales forecasting, inventory management, or financial planning, ibi WebFOCUS streamlines the forecasting process, making it accessible for users at all skill levels.

    Steps to Forecasting with Machine Learning
    Data Collection:  Gather historical time series data relevant to the forecasting problem.
    Data Preprocessing: Clean the data, handle missing values, and transform it as necessary (e.g., normalization).
    Feature Engineering: Create additional features that might help the model, such as lagged variables, moving averages, or date-related features (e.g., day of the week).
    Model Selection: Choose appropriate machine learning models based on the nature of the data and the problem.
    Model Training: Train the model on a training dataset while validating its performance using a validation set.
    Evaluation: Assess the model using metrics like Mean Absolute Error (MAE) or Root Mean Squared Error (RMSE).
    Forecasting: Use the trained model to make predictions on future values.
    Iterate: Continuously refine the model based on new data and changing patterns.
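    As a deliberately tiny illustration of the loop above outside WebFOCUS, the sketch below fits a lag-1 autoregression by least squares on synthetic "monthly sales" data, holds out the last few points, forecasts them, and evaluates with MAE. The data, the train/test split, and the model choice are all assumptions made for this example, not WebFOCUS's actual forecasting engine.

    ```python
    # Minimal forecasting sketch: lag-1 autoregression fit by least squares.
    # The "monthly sales" series and the holdout split are made up.

    def fit_ar1(series):
        """Fit y[t] = slope * y[t-1] + intercept by ordinary least squares."""
        x, y = series[:-1], series[1:]  # lagged feature and target (Step 3)
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
                 / sum((a - mx) ** 2 for a in x))
        return slope, my - slope * mx

    def forecast(last_value, slope, intercept, steps):
        """Roll the fitted model forward 'steps' periods."""
        preds, value = [], last_value
        for _ in range(steps):
            value = slope * value + intercept
            preds.append(value)
        return preds

    history = [100 + 5 * t for t in range(24)]   # synthetic trending series
    train, test = history[:20], history[20:]     # hold out the last 4 points
    slope, intercept = fit_ar1(train)            # Steps 4-5: select and train
    preds = forecast(train[-1], slope, intercept, len(test))
    mae = sum(abs(p - a) for p, a in zip(preds, test)) / len(test)  # Step 6: MAE
    ```

    In practice you would compare several models and richer feature sets (moving averages, calendar features) and refit as new data arrives; the point here is only the shape of the train/evaluate/forecast loop.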
    Conclusion
    Time series forecasting is an essential aspect of data analysis that has gained immense importance with the advent of machine learning, and ibi WebFOCUS is at the forefront of this evolution. By leveraging its powerful algorithms and user-friendly interface, organizations can make more accurate predictions and informed decisions across various sectors, from finance to healthcare. Time series forecasting within WebFOCUS not only enhances analytical capabilities but also provides a significant competitive advantage.
     
     
  • In the ever-evolving landscape of technology, the concept of Zero UI has emerged as a compelling vision for the future of human-computer interaction. As artificial intelligence (AI) continues to advance and natural language processing (NLP) capabilities improve, we are on the cusp of a paradigm shift in the way we interact with digital tools. Zero UI envisions a world where traditional graphical user interfaces (GUIs) with their menus, icons, and buttons become increasingly obsolete, replaced by more intuitive and seamless interactions driven by AI and chat interfaces.
     
    The Evolution of User Interfaces
    The evolution of user interfaces has been marked by a continuous quest for simplicity and efficiency. From command-line interfaces to GUIs, each step has brought us closer to a more natural and user-friendly interaction with computers. However, GUIs, while a significant improvement, still rely on users learning and navigating complex systems of visual elements.
     
    Enter AI and the Rise of Chat Interfaces
    The advent of AI and NLP has opened up new possibilities for user interfaces. Chat interfaces, powered by sophisticated AI models, allow users to interact with computers using natural language, much like conversing with another person. This shift towards conversational interaction represents a fundamental departure from traditional GUIs and paves the way for the emergence of Zero UI.
     
    Zero UI: The End of GUIs?
    Zero UI envisions a future where the need for explicit user interfaces diminishes as AI becomes capable of understanding and anticipating user needs. Instead of relying on visual cues and navigation, users will interact with digital tools through voice commands, gestures, or even just by expressing their intent. AI-powered systems will proactively interpret user needs and provide relevant information or perform tasks without requiring explicit instructions.
     
    The Impact on Traditional Digital Tools
    The rise of Zero UI will have a profound impact on the design and functionality of traditional digital tools.
    Simplified User Experience: With Zero UI, the complexity of GUIs will give way to a more streamlined and intuitive user experience. Users will no longer need to learn complex menus or remember specific commands, allowing them to focus on their tasks rather than on navigating the interface.
    Enhanced Accessibility: Zero UI has the potential to significantly improve accessibility for individuals with disabilities. By relying on natural language and other intuitive forms of interaction, Zero UI can remove barriers for users who may have difficulty using traditional GUIs.
    Personalized Interactions: AI-powered Zero UI systems can learn from user behavior and preferences to provide personalized experiences. This can include tailoring information and recommendations, automating routine tasks, and even anticipating user needs before they are expressed.
    Increased Efficiency: By eliminating the need for explicit navigation and command execution, Zero UI can significantly increase user efficiency. This is particularly relevant for tasks that involve repetitive actions or complex workflows.
    Challenges and Considerations
    While the vision of Zero UI is compelling, there are also challenges and considerations to address.
    User Trust and Control: As AI systems become more autonomous and proactive, it is crucial to ensure that users retain a sense of trust and control. Transparency in AI decision-making and the ability for users to override or modify AI actions will be essential.
    Privacy Concerns: The collection and analysis of user data required for Zero UI systems to function effectively raise concerns about privacy. Striking the right balance between personalization and privacy protection will be critical.
    The Human Touch: While AI can automate many tasks and interactions, there will always be a need for the human touch in certain situations. Designing Zero UI systems that seamlessly integrate human and AI capabilities will be key to creating truly effective and user-friendly experiences.
    Conclusion
    The rise of AI and chat interfaces heralds a new era in human-computer interaction. Zero UI, with its vision of intuitive and seamless interactions, represents a potential paradigm shift in the way we interact with digital tools. While challenges remain, the potential benefits of Zero UI in terms of simplified user experiences, enhanced accessibility, personalization, and increased efficiency are significant. As AI continues to advance, we can expect to see a gradual but steady shift towards Zero UI, shaping the future of digital tools and redefining the way we interact with technology.
     
  • WebFOCUS stores information that allows you to check what’s happening on your system, but sometimes it is difficult to find the information you need, or the logs have already been cycled to keep the filesystem from filling up.
    But ibi™ offers other options that are not as well known as they could be (a lot of people don’t read the manuals, and I’m including myself in that statement). That’s why I’m going to guide you through the Logging section of the Security and Administration manual, which lets you redirect the information stored in the logs to a table in your preferred database, and the database doesn’t even have to be on the same box!
    You can find the manual here and you also have a web page here 
    The best thing about this technique is that you can generate your own reports to retrieve the desired information and create your own Security Dashboard. In a future article, we’ll discuss how to use the REST services to gather information. You’ll then be able to mix both and get the best information for the administrators and security administrators about everything that happens through WebFOCUS.
We’ll also need the appropriate JDBC driver for the database we’re going to use, so have it handy, along with the connection information for the database, as we’re going to need both.
    The very first thing is to create the table where we’re going to store the logs (but we’ll also keep the logs on the filesystem). In this case, I’m going to store the audit.log information. The manual shows how to create the table under a PostgreSQL database, but I’ll also provide you with the script for MS SQL Server so you can see the differences and adapt them accordingly to your preferred database (Oracle, DB2, Derby…).
    Here are the scripts:
    MS SQL Server
    USE [DesiredDB]
    GO
/****** Object:  Table [dbo].[wf_log] ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[wf_log](
    [eventdate] [datetime] NOT NULL,
    [logger] [varchar](128) NOT NULL,
    [lvl] [varchar](12) NOT NULL,
    [logid] [varchar](128) NULL,
    [message] [varchar](255) NOT NULL,
    [excptn] [text] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

PostgreSQL

CREATE TABLE public.wf_log (
    eventdate timestamp with time zone,
    logger character varying(128) COLLATE pg_catalog."default",
    level character varying(12) COLLATE pg_catalog."default",
    logid character varying(128) COLLATE pg_catalog."default",
    message character varying(255) COLLATE pg_catalog."default",
    exception text COLLATE pg_catalog."default"
);

GRANT UPDATE, INSERT, SELECT ON TABLE public.wf_log TO webfocus;
Notice that some of this can be customized to your needs (the table names, for example). Also note that the SQL Server script does not use ‘level’ or ‘exception’ as column names (it uses ‘lvl’ and ‘excptn’ instead) because those are reserved words in SQL Server, whereas they can be used in PostgreSQL.
    Once you have the table created, it should look like this:

The next step is to create a backup copy of the file that we’re going to modify. If we break something, we’ll probably lose the filesystem logs too, so having a backup copy is always a good idea.
    Navigate to your WebFOCUS installation folder: ../ibi/WebFOCUSxx/webapps/webfocus/WEB-INF/classes and create a copy of the log4j2.xml file. Once you have the copy, we can start modifying the original.
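On a Linux box, the backup step might look like the following (the install path is an assumption; substitute your own WebFOCUS version and location):

```shell
cd /ibi/WebFOCUS93/webapps/webfocus/WEB-INF/classes   # adjust to your install
cp log4j2.xml log4j2.xml.bak                          # keep a pristine copy to roll back to
```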
    Edit the file with your preferred text editor.
    Following the instructions in the manual, we should go to the <RollingFile name="LOGuoa"> block in the <Appenders> section.
Just after this block, you can add the new JDBC block (and you can add as many as you want; for a while I redirected all my logs to both PostgreSQL and MS SQL Server), as you can see in the following screenshot:

    Notice that my PostgreSQL was on a different server, and the MS SQL Server was on the same box as my WebFOCUS.
Here’s the code in case you want to copy/paste it (just replace the {{strings}} with your own values):

        <!--
        <JDBC name="LOGpsql" tableName="public.wf_log">
            <DriverManager connectionString="jdbc:postgresql://{{IP:PORT}}/{{schema}}" driverClassName="org.postgresql.Driver" username="{{username}}" password="{{password}}"/>
            <Column name="eventdate" isEventTimestamp="true" />
            <Column name="logger" pattern="%logger" isUnicode="false"/>
            <Column name="level" pattern="%level" isUnicode="false" />
            <Column name="logid" pattern="%X{userId}" isUnicode="false" />
            <Column name="message" pattern="%message" isUnicode="false" />
            <Column name="exception" pattern="%ex{full}" isUnicode="false"/>
        </JDBC>
        -->
        <JDBC name="LOGmssql" tableName="{{DB_name}}.dbo.wf_log">
            <DriverManager connectionString="jdbc:sqlserver://{{IP:PORT}};DatabaseName={{DB_name}};encrypt=false" driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver" username="{{username}}" password="{{password}}" />
            <Column name="eventdate" isEventTimestamp="true" />
            <Column name="logger" pattern="%logger" isUnicode="false"/>
            <Column name="lvl" pattern="%level" isUnicode="false" />
            <Column name="logid" pattern="%X{userId}" isUnicode="false" />
            <Column name="message" pattern="%message" isUnicode="false" />
            <Column name="excptn" pattern="%ex{full}" isUnicode="false"/>
        </JDBC>

The manual explains the different patterns you can store in each column, so you can customize exactly what gets saved there.
    Now that the connection is created and the patterns for the log are assigned to each column, we need to tell this same file which operations need to be logged there. So we’ll need to go to the <Loggers> block and add our “appender” for every logger we want to store in tables.
    Due to the nature of the tables I created, I’m going to store all the information that comes from the logs related to any ‘com.ibi.uoa.xxxxx’ function. That includes any change on users, groups, rules, roles, signin, import or export of Change Management packages…


    You can also redirect to PostgreSQL or MS SQL Server based on these loggers, for example, signin operations can be sent to PostgreSQL, while the rest of them can be sent to MS SQL Server.
    Feel free to customize this to match your needs!
    Here’s my code:
        <Logger name="com.ibi.monitor.requests" level="info" additivity="false">
            <AppenderRef ref="LOGrequests"/>
        </Logger>
        <Logger name="com.ibi.uoa" level="error" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.impex.import" level="debug" additivity="false">
            <AppenderRef ref="LOGcm_import"/>
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.impex.export" level="debug" additivity="false">
            <AppenderRef ref="LOGcm_export"/>
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.groups" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.users" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.roles" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.rules" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.signin" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.ownership" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.shares" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.content" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.config" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.seats" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.caster_config" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <AppenderRef ref="LOGmssql"/>
        </Logger>
        <Logger name="com.ibi.uoa.magnify" level="info" additivity="false">
            <AppenderRef ref="LOGuoa"/>
            <AppenderRef ref="LOGmssql"/>
        </Logger>

And just to finish this part, you will need to add the appender to the <Root level="error"> block.
That way, any error from those components that would appear in the events log is also captured in the database.
        <Root level="error">
            <AppenderRef ref="LOGevent" />
            <AppenderRef ref="sysout" />
            <!--<AppenderRef ref="LOGpsql"/>-->
            <AppenderRef ref="LOGmssql"/>
        </Root>
Save all of these changes and stop your application server. Make sure the server can find the proper JDBC driver, for example by copying it under the ../ibi/tomcat/lib folder or adding it to the application server’s classpath (in the case of Tomcat). Then clear the application server cache (the work folder in the case of Tomcat) and start it again.
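For a Tomcat install on Linux, those steps might look like the following sketch (paths and the driver jar name are assumptions; use the driver for your own database and your own install locations):

```shell
cp postgresql-42.7.3.jar /ibi/tomcat/lib/   # hypothetical driver jar; put it on the classpath
/ibi/tomcat/bin/shutdown.sh                 # stop the application server
rm -rf /ibi/tomcat/work/*                   # clear Tomcat's cache
/ibi/tomcat/bin/startup.sh                  # start it again
```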
You should now be able to sign in to WebFOCUS and see the new table start filling with data:

Now, just create a new synonym on the WebFOCUS Reporting Server against the table (or tables, depending on how many you created).
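With the synonym in place, a simple report against it might look like this (assuming you named the synonym WF_LOG; adjust the synonym and field names to match what you generated):

```
TABLE FILE WF_LOG
PRINT LOGGER MESSAGE
BY HIGHEST EVENTDATE
WHERE LOGGER CONTAINS 'signin'
END
```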
    And once done, you can start creating your visualizations for your Dashboard:

    And build something you can use to find the information you want to see from the logs in a more comfortable view:

Start enjoying your log reviews and leave those boring plain-text files behind!
I hope this helps you make your environment even more secure!
    Pablo Alvarez
     
  • Introduction
    When deploying WebFOCUS Container Edition in an AWS EKS cluster, you can utilize the AWS Load Balancer Controller to create an Application Load Balancer.
     
    AWS Load Balancer Controller
The AWS Load Balancer Controller manages Elastic Load Balancers for a Kubernetes cluster.
It satisfies Kubernetes Ingress resources by provisioning Application Load Balancers, and Kubernetes Service resources by provisioning Network Load Balancers.

Enabling the AWS Load Balancer Controller in an EKS cluster
    https://docs.aws.amazon.com/eks/latest/userguide/lbc-helm.html
     

Learn more about the AWS Load Balancer Controller here:
    https://kubernetes-sigs.github.io/aws-load-balancer-controller/latest/
     
    WebFOCUS CE deployment changes
The following changes are required in wf.integ.yaml.gotmpl:
ingress:
  enabled: true
  annotations:
    kubernetes.io/ingress.class: "alb"
    alb.ingress.kubernetes.io/certificate-arn: <acm certificate url>
    alb.ingress.kubernetes.io/load-balancer-name: myingressalb
    alb.ingress.kubernetes.io/scheme: internet-facing
    alb.ingress.kubernetes.io/target-type: ip
    alb.ingress.kubernetes.io/target-group-attributes: stickiness.enabled=true,stickiness.lb_cookie.duration_seconds=600
    alb.ingress.kubernetes.io/ssl-redirect: '443'
  path: /*
Here, the alb ingress class is provided by the AWS Load Balancer Controller.
Session stickiness is enabled using the annotation "stickiness.enabled=true,stickiness.lb_cookie.duration_seconds=600".
    Update "platform.servingDomain" with your fully qualified domain name 
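For example, the value might look like this (the domain below is a hypothetical placeholder; it should match the ACM certificate referenced in the ingress annotations):

```yaml
platform:
  servingDomain: "webfocus.example.com"   # hypothetical FQDN
```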
     
    Updating existing WebFOCUS CE installation
Update only the appserver release:
    helmfile -e dev --selector name=appserver sync  
Once the sync is complete, check the appserver Ingress definition; it looks like this:
$ kubectl -n webfocus get ing appserver -o yaml

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  annotations:
    alb.ingress.kubernetes.io/certificate-arn: arn:aws:acm:us-west-2:12345678:certificate/adaae1da-21b8-41ce-a801-eaf5afe80d7c
    alb.ingress.kubernetes.io/load-balancer-name: myingressalb
    alb.ingress.kubernetes.io/scheme: internet-facing
    alb.ingress.kubernetes.io/target-type: ip
    alb.ingress.kubernetes.io/target-group-attributes: stickiness.enabled=true,stickiness.lb_cookie.duration_seconds=60
    alb.ingress.kubernetes.io/ssl-redirect: '443'
    meta.helm.sh/release-name: appserver
    meta.helm.sh/release-namespace: webfocus
  finalizers:
  - ingress.k8s.aws/resources
  generation: 2
  labels:
    app.kubernetes.io/component: webfocus
    app.kubernetes.io/instance: appserver
    app.kubernetes.io/managed-by: Helm
    app.kubernetes.io/name: appserver
    app.kubernetes.io/part-of: TIBCO-WebFOCUS
    app.kubernetes.io/version: 1.3.2
    helm.sh/chart: appserver-1.3.2
  name: appserver
  namespace: webfocus
spec:
  ingressClassName: alb
  rules:
  - http:
      paths:
      - backend:
          service:
            name: appserver
            port:
              name: port8080
        path: /*
        pathType: ImplementationSpecific
Now you can access your application by updating the security group to allow your IP to reach HTTPS port 443.
Get the ALB address from the Ingress definition:
$ kubectl -n webfocus get ing
NAME        CLASS   HOSTS   ADDRESS                                             PORTS   AGE
appserver   alb     *       myingressalb-12345678.us-west-2.elb.amazonaws.com   80      16h
In this case, WebFOCUS is accessible using the URL https://myingressalb-12345678.us-west-2.elb.amazonaws.com
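A quick way to verify that the ALB is forwarding traffic is a curl against the address shown in the Ingress (the URL below is the example address above; -k skips certificate validation, which is useful while DNS still points elsewhere):

```shell
curl -kI https://myingressalb-12345678.us-west-2.elb.amazonaws.com/
```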
  • In today's digital landscape, creating a practical business intelligence (BI) platform requires more than just advanced features and robust data-handling capabilities. It demands a deep understanding of the end-users—their needs, goals, and challenges. This is where the concept of business user personas becomes essential. For WebFOCUS, a comprehensive and flexible BI and analytics platform, leveraging business user personas is crucial to ensure the tool meets the diverse needs of its users.
     
    Understanding Business User Personas
    A business user persona is a semi-fictional representation of a segment of users within an organization. These personas are developed based on user research and accurate data about the users, including their behaviour patterns, motivations, goals, and challenges. By creating detailed personas, organizations can gain insights into the different types of users who will interact with their BI platform, ensuring that the design and functionality align with their specific needs.
     
    The Role of Business User Personas in WebFOCUS
     
    1. Enhanced User Experience
    WebFOCUS aims to provide a seamless user experience, enabling users to easily access, analyze, and visualize data. By developing business user personas, the design and development teams can tailor the interface and features to meet the distinct needs of each user group. For instance, a persona representing a financial analyst might prioritize advanced data visualization tools and detailed financial reports. In contrast, a persona for a sales manager might focus on quick access to sales metrics and customizable dashboards.
     
    2. Targeted Functionality
    Different users have varying requirements from a BI platform. Business user personas help identify these unique needs, allowing WebFOCUS to offer targeted functionality. For example, an IT administrator persona might need robust data governance and security features, while a business executive persona might need high-level insights and automated reporting. By catering to these specific needs, WebFOCUS can ensure higher user satisfaction and adoption rates.
     
    3. Improved Communication and Marketing
    Creating business user personas helps WebFOCUS' marketing and sales teams communicate more effectively with potential clients. Understanding the pain points and goals of each persona allows for more personalized and relevant messaging, increasing the likelihood of engagement. It also aids in demonstrating the value of WebFOCUS to different organizational stakeholders, showcasing how the platform can address their unique challenges.
     
    4. Informed Product Development
    Product development teams benefit significantly from business user personas. These personas provide a clear roadmap for prioritizing features and improvements based on actual user needs. For WebFOCUS, this means focusing on developing tools and functionalities that offer the most value to its diverse user base, which includes data scientists and analysts as well as business executives and IT professionals.
     
    5. Increased User Adoption and Retention
    When users find that a BI platform like WebFOCUS meets their specific needs and aligns with their workflows, they are more likely to adopt and consistently use the platform. Business user personas ensure the platform is intuitive and valuable for each user group, fostering higher adoption rates and long-term user retention. Satisfied users are also more likely to advocate for the platform within their organization, driving further growth and engagement.
     
    Importance of Business User Personas for WebFOCUS
     
    The importance of business user personas for WebFOCUS cannot be overstated. Here are a few key reasons why:
    1. User-Centric Design: Personas ensure that WebFOCUS's design and functionality are centered around the actual users, leading to a more intuitive and efficient user experience.
    2. Strategic Decision Making: Personas provide insights that inform strategic decisions in product development, marketing, and customer support, ensuring that resources are allocated effectively.
    3. Competitive Advantage: By thoroughly understanding and addressing the needs of different user groups, WebFOCUS can differentiate itself from competitors, offering a more tailored and effective BI solution.
    4. Enhanced Collaboration: Personas facilitate better communication and collaboration among cross-functional teams, ensuring that everyone is aligned on user needs and priorities.
    5. Long-Term Success: Ultimately, business user personas contribute to the long-term success of WebFOCUS by ensuring that the platform evolves in line with user needs and market trends, maintaining its relevance and value.
     
    Conclusion
    Incorporating business user personas into the development and marketing strategies of WebFOCUS is essential for delivering a BI platform that genuinely meets the diverse needs of its users. WebFOCUS can enhance user experience, drive adoption and retention, and maintain a competitive edge in the BI market by understanding and addressing the specific requirements of different user groups. As organizations continue to rely on data-driven insights for decision-making, the role of user personas in shaping effective BI solutions will only grow in importance.
  • On July 25, 2024, ibi announced the launch of the new ibi Platform and associated offerings: ibi Analytics, ibi Mainframe, and ibi Data Intelligence. As part of this announcement, ibi products–including WebFOCUS and FOCUS–continue to be offered to our customers through these new platform offerings.
    In conjunction with the release of the new ibi platform offerings, our customers received product retirement and end of sale notices for specific stand-alone product SKUs. Rest assured, our key offerings are available in the ibi platform offerings designed to maximize value for our customers through unlimited users, usage, and environments.
    The new platform offerings are as follows:
ibi Analytics contains products from the following previous SKUs:
- ibi WebFOCUS - Basic/Standard/Enterprise Editions
- ibi WebFOCUS - Container Edition
- ibi Data Migrator
- ibi WebFOCUS Omni Application Premium Adapter
ibi Mainframe contains products from the following previous SKUs:
- ibi FOCUS
- ibi Open Data Hub for Mainframe
- ibi iWay Service Manager
- ibi iWay Service Manager EDI Add-on (Trading Partner Manager)
- ibi WebFOCUS - Basic/Standard/Enterprise Editions
- ibi Data Migrator
- ibi WebFOCUS Omni Premium Mainframe Adapter
ibi Data Intelligence contains products from the following previous SKUs:
- ibi Data Intelligence - Basic/Standard/Enterprise Editions
- ibi Data Quality
- ibi iWay Service Manager
- ibi iWay Service Manager EDI Add-on (Trading Partner Manager)
- ibi Data Migrator
- ibi WebFOCUS Omni Application Premium Adapter
Additionally, ibi Address Service is available as an add-on to ibi Data Intelligence for address verification within Data Quality. More information about the new offerings can be found in this ibi Platform announcement blog by Vijay Raman.
    As of September 1, 2024, all renewals and new subscriptions will align to the new offerings above. Certain public sector customers may be subject to an exception to this change. 
    Additional information about the earlier product retirement and end of sale notices may be found in the articles 000053903 and 000053904, at http://support.tibco.com. 
If you have additional questions, please contact your account manager or reach out via the Contact Us page.
     
  • How WebFOCUS Integrates Predictive Analytics
    Data Integration and Preparation
    WebFOCUS excels in integrating and preparing data from diverse sources. Before predictive modeling can be performed, data must be cleaned, transformed, and aggregated. WebFOCUS provides powerful ETL (extract, transform, load) tools that ensure your data is accurate and ready for analysis. The platform supports integration with various data sources, including databases, cloud services, and big data environments.
    Predictive Modeling - Machine Learning
    WebFOCUS includes built-in predictive modeling capabilities that allow users to build and deploy machine learning models directly within the platform. The integration of predictive analytics features simplifies the process of creating algorithms to forecast trends and outcomes. Users can choose from a range of machine learning techniques, including regression analysis, binary classification, anomaly detection, clustering, and time-series forecasting.
    User-Friendly Visualization and Reporting
    Visualization is a crucial aspect of predictive analytics. WebFOCUS provides a rich set of visualizations that help users interpret complex data and predictive models easily. From interactive dashboards to detailed reports, WebFOCUS ensures that predictions are presented in a clear and actionable format. This allows stakeholders to make informed decisions based on the insights generated by predictive models.
    Practical Applications of Predictive Analytics with WebFOCUS
    Customer Behavior Analysis
    By analyzing historical customer data, WebFOCUS predictive analytics can help businesses understand purchasing patterns, segment customers, and forecast future buying behaviors. This insight can drive targeted marketing campaigns, improve customer retention, and enhance overall customer experience.
    Operational Efficiency
    Predictive analytics can be applied to optimize operational processes. For example, businesses can forecast inventory levels, predict equipment maintenance needs, and streamline supply chain operations. WebFOCUS helps organizations anticipate potential disruptions and make proactive adjustments.
    Financial Forecasting
    Financial planning and analysis benefit greatly from predictive analytics. WebFOCUS enables accurate forecasting of revenue, expenses, and cash flow. By analyzing financial trends and market conditions, businesses can make strategic investment decisions and manage financial risks effectively.
    Risk Management
    Identifying and mitigating risks is crucial for any organization. WebFOCUS predictive analytics can assess potential risks and vulnerabilities by analyzing historical data and external factors. This proactive approach helps businesses develop strategies to minimize risks and protect their assets.
    Getting Started with Predictive Analytics in WebFOCUS
    To leverage predictive analytics in WebFOCUS, follow these steps:
    Define Your Objectives
    Clearly outline the goals of your predictive analytics project. What questions are you trying to answer? What outcomes are you looking to achieve?
    Prepare Your Data
    Use WebFOCUS’s data integration tools to gather and clean your data. Ensure that your dataset is comprehensive and relevant to your predictive modeling needs.
    Train Machine Learning Models
    Utilize WebFOCUS’s predictive modeling capabilities to create and test your models. Choose the appropriate algorithms and techniques based on your objectives and data characteristics.
    Visualize and Interpret Results
    Create visualizations and reports to present your predictive insights. Make sure the results are understandable and actionable for your stakeholders.
    Run Machine Learning Models
    Embed predictive models into your business processes. Adjust and refine models as needed based on new data and changing conditions.
    Conclusion
    Predictive analytics is a game-changer for businesses seeking to stay ahead in a competitive landscape. With WebFOCUS, organizations can seamlessly integrate predictive analytics into their data strategy, unlocking valuable insights and making more informed decisions. By leveraging WebFOCUS’s advanced capabilities, businesses can transform data into a strategic asset, driving growth and innovation.
     
  • Taking Control of Your Data with Ease
    In the dynamic world of data manipulation and report generation, having the ability to fine-tune settings is essential for achieving desired results. Traditionally, managing SET commands in WebFOCUS has been a complex process requiring manual adjustments outside the Designer interface. To address this, we're excited to introduce a new feature in v9.3.0: the SET Command can now be accessed from the Designer UI via a dialog box while authoring content.

    Introducing the SET Command Dialog Box
    The new SET Command feature can be accessed through a dedicated icon on the top toolbar. Clicking this icon opens the "Manage SET Command" dialog box, which offers a comprehensive list of SET commands and their corresponding options. This update aims to provide users with more control and flexibility without needing to resort to text editors, thus maintaining the integrity and re-openability of procedures in Designer.


    Key Features and Benefits
- Comprehensive Command List: The dialog box presents a comprehensive list of commonly used SET commands, allowing you to easily identify and modify the ones that impact your data. Please note that this list is not exhaustive, and additional SET commands can be added upon request.
- Intuitive Interface: With clear explanations and default values, the dialog box simplifies the configuration process, even for users new to SET commands.
- Enhanced Flexibility: Gain greater control over your reports by customizing settings like data formatting, display options, and more.
- Improved Efficiency: Save time and effort by managing SET commands directly within Designer, eliminating the need for external adjustments.
- Maintainable Procedures: Ensure the integrity and re-usability of your Designer procedures by keeping all settings within the interface.
    Understanding Key SET Commands
    To provide a better understanding of the SET commands available, here's a brief overview of some commonly used options:
- ASNAMES: Controls the display of aliases in reports.
- BASEURL: Specifies the base URL for relative links in HTML output.
- CDN: Controls continental decimal notation, that is, which punctuation is used for decimal points and thousands separators in numeric output.
- COLLATION: Sets the character collation for data sorting and comparison.
- CSSURL: Specifies the URL for an external CSS stylesheet.
- DEFCENT: Sets the default century used to interpret two-digit years.
- DRILLMETHOD: Specifies the method used for drill-through actions.
- EMBEDHEADING: Controls whether to embed the report heading in the HTML output.
- EMPTYREPORT: Defines the content of an empty report.
- HIDENULLACRS: Determines whether to hide null values in cross-tab reports.
- HOLDLIST: Controls which fields are included in HOLD files.
- HTMLEMBEDIMG: Specifies how images are embedded in HTML output.
- HTMLENCODE: Controls HTML encoding of characters.
- JPEGQUALITY: Sets the JPEG image quality for output.
- JSURLS: Specifies URLs for external JavaScript files.
- LANG: Sets the language for report output.
- MESSAGE: Controls the display of informational messages.
- MISSING: Determines how missing values are handled.
- NODATA: Specifies the content to display when there is no data.
- PRINTPLUS: Controls additional printing options.
- RANK: Defines the method used for ranking data.
- SHOWBLANKS: Determines whether to display blank rows or columns.
- TITLES: Controls the display of report titles.
- UNITS: Specifies the measurement units for report output.
- YRTHRESH: Sets the year threshold used together with DEFCENT to interpret two-digit years.

User Interaction and Reset Functionality
The SET Command feature is available in Author mode and Document mode, but not in Assemble-only mode. SET commands apply at the page level for authored pages, ensuring consistency across items; the same applies to Document mode. Users can view and modify previously set commands, and a "Reset" button is available to revert all commands to their default state. An info icon is integrated into the UI, providing descriptions of SET commands similar to the functionality seen in App Studio. This help feature remains visible while interacting with each command.

Empowering Users
    By placing SET command management directly within the Designer interface that can be accessed while authoring content, we aim to empower users to create more effective and tailored reports and charts. This new feature is a significant step towards enhancing user experience and productivity within Designer.
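For illustration, the settings managed in the dialog are emitted as ordinary SET statements in the generated procedure; a fragment might look like this (the values below are hypothetical examples, not defaults):

```
SET NODATA = 'N/A'
SET EMPTYREPORT = ON
SET HIDENULLACRS = ON
```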
     